One of the first approaches to writing software was to make the application do everything required by its specification. Thus the earliest machine code applications did everything from handling I/O (with individual devices) to accomplishing complex processing. As applications became more complex, it dawned on people that these machine code applications were too bulky, and that parts of them needed to percolate down into separate "libraries" specialized in solving specific problems (such as I/O, for example). These libraries were "packaged" along with the application and became part of it. However, since the same libraries were used across multiple applications, it became increasingly clear that the libraries should always be around when the applications were running, so that applications could invoke the libraries' services without explicitly including them in their packaging. Thus the packaged libraries that were hitherto included with every application became a "layer" over which the application executed.
Some of these libraries were recognized as so basic that they should be coupled with the underlying hardware, forming what came to be called the "operating system". After all, things like I/O, memory management, processor control etc. were nothing but a bunch of hardware instructions! As operating systems (or O/S, as they are called) became more and more complex, a greater part of application functionality was usurped by them. Operating systems thus started controlling threading, I/O, memory management, processors, network ports, the computer bus etc. It is now pretty much accepted that applications don't need to do all the heavy lifting with respect to hardware. They instead invoke various operating system libraries through a prescribed API that differs from one O/S to another.
The next innovation in the space came about when applications needed to provide "services". Special applications called "daemons" were created that started with the operating system and provided various kinds of services to clients. The client-server architecture was in full swing! Applications had to wait for a request to come in on the wire, spawn a separate process (or, more often, a separate thread) to handle it, and then do the actual processing. The applications had to contain all kinds of logic to control the number of threads currently available, and so on. Furthermore, most applications used a database, and the most expensive database operations are setting up and closing a connection. Hence the notion of connection pools was developed. Again, history repeated itself. Since most applications had to do the same kind of processing with regard to supporting multiple threads, pooling database connections, supporting transactions spanning multiple databases (aka two-phase commits), etc., it was natural that they be executed as components in an environment that provided these services.
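The connection-pool idea described above can be sketched in a few lines. This is a toy illustration (the class and the stand-in "connection" are invented for this sketch, not any real driver's API): connections are created once up front, handed out on demand, and returned for reuse, so the expensive setup cost is paid only at startup.

```python
import queue


class ConnectionPool:
    """Toy fixed-size pool: connections are created once and then reused."""

    def __init__(self, create_conn, size=4):
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(create_conn())  # pay the setup cost up front

    def acquire(self):
        return self._pool.get()  # blocks if every connection is in use

    def release(self, conn):
        self._pool.put(conn)  # make the connection available again


# Usage with a stand-in "connection" object instead of a real database:
pool = ConnectionPool(lambda: object(), size=2)
c1 = pool.acquire()
c2 = pool.acquire()
pool.release(c1)
c3 = pool.acquire()
assert c3 is c1  # the released connection is reused, not re-created
```

A real pool would also validate connections before reuse and time out acquirers, but the essential trick is the same: amortize an expensive setup over many requests.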
Thus was born the application server! We entered the era of "managed" code. Managed code is code written in such a way that it can be managed by other software; the software that does the managing is typically called an application server. Application servers took more and more functionality out of the application itself. Thus transaction management, connection pooling, and thread management and pooling became core infrastructural functions, provided this time by the application server. The application server sat at a level above the operating system, since it had to be able to accomplish tasks across multiple operating systems. There are also frameworks that sit on top of the application server to provide particular functionality such as security, logging, inversion of control and the like. Applications that were once small have now become several layers thick.
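To make "inversion of control" concrete, here is a deliberately tiny sketch (the container and the component names are hypothetical, not any real framework's API): components stop constructing their own dependencies and instead receive them from a container that wires everything together.

```python
class Container:
    """Toy IoC container: components are registered as factories, and the
    container, not the application code, constructs and wires them."""

    def __init__(self):
        self._factories = {}

    def register(self, name, factory):
        self._factories[name] = factory

    def resolve(self, name):
        # A factory may ask the container for its own dependencies,
        # so the wiring is inverted: the container drives construction.
        return self._factories[name](self)


container = Container()
# Hypothetical components for illustration only.
container.register("audit_log", lambda c: [])
container.register("order_service",
                   lambda c: {"name": "orders", "log": c.resolve("audit_log")})

service = container.resolve("order_service")
service["log"].append("order placed")  # uses the injected dependency
```

The point is the direction of control: the application declares what it needs, and the surrounding infrastructure decides how and when to supply it, which is exactly the relationship an application server has with the code it manages.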
Look at this really long stack, for instance. I can see people bemoaning the apparent profligacy of writing such layered applications. But I don't see an application programmer who is proficient enough to address all the concerns that these layers address. Can you imagine one person being good at application development, databases, UI, security, thread management, transactions and web services? It is necessary to divvy these concerns up among multiple developers, which perforce leads us to the current layered architecture.
Skills started getting "layered" as well. There were database developers, middleware developers, back-end script coders, UX developers, experience developers (the ones who do only HTML/CSS) and UX designers.
Interestingly, as with most trends, we have seen this one partially reverse of late. People are looking for full-stack developers and polyglot architects. Teams need a mesh of skills, including deployment skills. But the basic concepts of layered architecture are here to stay, IMO.