[Mainframe] The ultimate answer to the handling of big data: the mainframe
Why is the trusty old mainframe still relevant? This survey of 187 IT pros tells the tale.
Quick question: What is more than 60 years old but still spry enough to beat the best mid-range clusters at big data?
If your answer was the mainframe, then you’re correct.
If you believe that you don’t use mainframe computers, think again. If you put a debit card into an ATM, you’re using a mainframe. No, the ATM isn’t a mainframe, but it’s connected to one. Big companies still rely on the mainframe for its data handling, security, and scalability, which remain unmatched by any other platform or cluster of platforms in computing. As proof that the platform still merits new development and new research, IBM spent five years and $1 billion building its new z13 mainframe. And no one but T. Boone Pickens or Bill Gates can afford to sink a billion bucks into a hobby.
For big data, nothing comes close to the processing power of the mainframe computer. If you don’t believe me, you can download a white paper on the topic and check the facts for yourself, or watch a video by Syncsort (the folks who supplied the survey data for this post). You can also grab a copy of a Forrester report on big data analytics for IT.
Source: http://www.zdnet.com