Processing and analysing data calls for immense computing power and advanced technology stacks. The evolution we are witnessing today has been gradual. In this blog, Narendra Shukla reminisces about this journey with us.

Ingesting massive amounts of data requires exceptionally fast computing power. The appetite for radically faster computing has grown with the need to process yottabytes of data. Big data has long been mainstream and is now ubiquitous. Companies like Facebook and Twitter process terabytes and petabytes of data in real time. Technologies like Hadoop and MapReduce have revolutionized the IT industry, and Apache Spark, with its in-memory engine, has sped up big data processing by up to 100x.
Read More...