Old computing models tend to linger too long, but client-server was based on a fallacy, and it needs to go away sooner rather than later

I’m sure we — that is, me and the LinkedIn or Twitter spheres — can quibble over the definition of client-server versus the model I’ll call “purely distributed.” So allow me to define client-server as one or more clients connected to a server listening on a pool or set of sockets that mainly scales vertically and usually has a central data store. This is the model of the LAN.

I’ll define the distributed model as N clients or peers connected to a mesh of N servers that mainly scale horizontally and use a data store (or stores) that also shards and distributes processing. This model is built to tolerate failure and demand spikes: you can add nodes to grow capacity (often near-linearly) and relocate infrastructure at will. This is the model of the cloud.
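The routing idea behind a sharded, horizontally scaled data store can be sketched in a few lines. This is only an illustration, not anything from the article: the node names are hypothetical, and real systems use consistent hashing (not simple modulo) so that adding a node reshuffles as few keys as possible.

```python
import hashlib

# Hypothetical node names; in a real cluster these would be host addresses.
NODES = ["node-a", "node-b", "node-c"]

def shard_for(key: str, nodes=NODES) -> str:
    """Pick the node responsible for a key via stable hashing.

    The same key always routes to the same node, so reads and writes
    land on the shard that owns the data. Note this modulo scheme
    remaps many keys when the node list changes; consistent hashing
    exists precisely to limit that reshuffling.
    """
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]
```

Because every client computes the same mapping, no central coordinator is needed to find a key's home, which is what lets the mesh scale out by adding nodes.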

Source: http://www.infoworld.com

Good comments:

This article is as misinformed as the NoSQL evangelists predicting (or demanding) the death of RDBMS. Any time you think a fundamental computing model is outdated, 99% of the time you are showing your colors as somebody who has *no* *idea* *how* *business* *or* *technology* *actually* *work*.

Somebody below mentioned games. This is a great example. Pushing AAA games all the way into the cloud, for example, makes no sense. Distributing the computation to the client (console, PC, etc.) is the only way to get the graphics and sound we expect nowadays in a modern game. The server then becomes a gateway for multiplayer. If the entire game had to be computed entirely in the cloud, it would be unbelievably expensive, completely choke the internet, and be a terrible end user experience.

Putting aside games for a minute, even in business, this doesn’t always make sense. Not all problems can be solved efficiently exactly the same way, and you shouldn’t pigeonhole yourself just because some out-of-touch author on the internet decides client-server is SOOOO yesterday. Use client-server when it makes sense. Use cloud when it makes sense. Use local installations with no server when it makes sense. Stop following trends and engineer your software in the way that makes the most sense!

I am sorry, but as one who went through the entire client-server phase in our profession, from its start until it was superseded by the Internet, I have to say that the author of this piece has absolutely no idea what he is talking about.

Client-server was, in fact, designed for the implementation of distributed systems, and it still works quite nicely. The technical concept of client-server is that database operations are performed at the database, most often on a remote server, while interface processing is done at the client. Business and logic components are then compartmentalized into assemblies that can either be placed on a separate set of remote servers that interact with both the client and the database, thus “distributing” the processing load, or, as has been done more recently with web applications, deployed on the same machine as the application server.
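The division of labor the commenter describes, data operations at the server and interface work at the client, can be sketched with a minimal socket example. This is a toy illustration under obvious simplifications: an in-memory dict stands in for the database, the function and key names are invented, and a real system would need framing, error handling, and concurrency.

```python
import socket
import threading

# In-memory dict standing in for the server-side database.
DATA = {"alice": "admin", "bob": "user"}

def serve_once(server_sock: socket.socket) -> None:
    """Server side: accept one client, look up the requested key in
    the 'database', and send back the result. The client never touches
    the data store directly."""
    conn, _addr = server_sock.accept()
    with conn:
        key = conn.recv(1024).decode("utf-8").strip()
        conn.sendall(DATA.get(key, "NOT FOUND").encode("utf-8"))

def lookup(host: str, port: int, key: str) -> str:
    """Client side: interface logic only; it sends a key and renders
    whatever the server returns."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(key.encode("utf-8"))
        return sock.recv(1024).decode("utf-8")

if __name__ == "__main__":
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))  # ephemeral port
    server.listen()
    threading.Thread(target=serve_once, args=(server,), daemon=True).start()
    print(lookup("127.0.0.1", server.getsockname()[1], "alice"))
```

The middle tier the commenter mentions would simply sit between `lookup` and the data-owning server, applying business rules before touching the store; the shape of the conversation stays the same.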