== Factors driving In-memory products ==
Cheaper and higher-performing hardware: According to [[Moore’s law]], computing power doubles every two to three years while its cost declines. CPU processing,
64-bit operating systems: Though the idea of in-memory technology is not new, it has only recently become practical thanks to widely available, affordable 64-bit processors and declining memory chip prices. [[64 bit]] operating systems allow access to far more RAM (up to 100 GB or more) than the 2 or 4 GB accessible on 32-bit systems. By making terabytes (1 TB = 1,024 GB) of space available for storage and analysis, 64-bit operating systems make in-memory processing scalable. The use of flash memory enables such systems to scale to many terabytes more economically.
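The gap between 32-bit and 64-bit address spaces can be seen with simple pointer-width arithmetic; the figures below are an illustrative sketch, not taken from any particular product:

```python
# Illustrative arithmetic: why 32-bit systems top out near 4 GB of RAM
# while 64-bit systems can, in principle, address millions of terabytes.

def addressable_bytes(pointer_bits):
    """Maximum byte-addressable memory for a given pointer width."""
    return 2 ** pointer_bits

GIB = 1024 ** 3  # gibibyte
TIB = 1024 ** 4  # tebibyte

print(addressable_bytes(32) / GIB)  # 4.0 GiB -- the 32-bit ceiling
print(addressable_bytes(64) / TIB)  # 16777216.0 TiB, the theoretical 64-bit limit
```

In practice operating systems and hardware expose far less than the theoretical 64-bit maximum, but even the practical limits (hundreds of gigabytes to terabytes) are what make in-memory processing scalable.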
Data volumes: As the data used by organizations grew, traditional data warehouses could no longer deliver timely, accurate, real-time data. The extract, transform, load ([[Extract, transform, load|ETL]]) process that periodically updates data warehouses with operational data can take anywhere from a few hours to weeks to complete, so at any given point in time the data is at least a day old. In-memory processing makes it easy to have instant access to terabytes of data for real-time reporting.
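The staleness introduced by periodic ETL can be sketched with a toy model (all names here are hypothetical, not from any real product): a warehouse copy only reflects data as of its last batch load, while an in-memory store reflects each write immediately.

```python
# Minimal sketch (hypothetical names): a batch-refreshed warehouse vs.
# an in-memory store that is updated on every write.
import datetime

warehouse = {}        # periodically refreshed copy (the ETL target)
in_memory_store = {}  # updated on every write

def etl_refresh(operational_rows):
    """Extract-transform-load: bulk-copy operational data into the warehouse."""
    warehouse.clear()
    warehouse.update(operational_rows)
    warehouse["last_loaded"] = datetime.datetime.now(datetime.timezone.utc)

def write(key, value, operational_rows):
    operational_rows[key] = value  # the operational system sees it now
    in_memory_store[key] = value   # the in-memory copy sees it now too
    # the warehouse will not see it until the next etl_refresh()

operational = {}
etl_refresh(operational)
write("orders_today", 42, operational)

print("in-memory:", in_memory_store.get("orders_today"))  # 42
print("warehouse:", warehouse.get("orders_today"))        # None until next refresh
```

Between refreshes, every warehouse query returns the stale snapshot; the in-memory store answers from current data, which is the latency difference the paragraph above describes.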