{{Ad|date=June 2013}}
}}
{{merge|In-memory database|discuss=Talk:}}
== Definition ==
Businesses increasingly demand fast and easy access to information in order to make reliable decisions, and in-memory processing meets this demand by giving users immediate access to the relevant data. Traditional [[Business Intelligence]] (BI) technology loads data onto disk in the form of tables and multi-dimensional cubes against which queries are run; in-memory processing instead loads the data into main memory ([[Random-access memory|RAM]]), so queries avoid most disk access.
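The difference between disk-based and in-memory querying can be sketched with SQLite, which offers both a file-backed and a purely in-memory store (a minimal illustration, not any particular BI product; the table and data are invented):

```python
import os
import sqlite3
import tempfile

# Hypothetical example: the same table held on disk and entirely in RAM.
disk_path = os.path.join(tempfile.mkdtemp(), "sales.db")
disk_db = sqlite3.connect(disk_path)   # disk-based store (traditional BI)
mem_db = sqlite3.connect(":memory:")   # in-memory store

rows = [("east", 100.0), ("west", 250.0), ("east", 75.0)]
for db in (disk_db, mem_db):
    db.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    db.executemany("INSERT INTO sales VALUES (?, ?)", rows)

# Identical SQL runs against both; only the storage medium differs,
# and with it the amount of disk I/O each query performs.
totals = dict(mem_db.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))
print(totals)  # {'east': 175.0, 'west': 250.0}
```

On realistic data volumes the in-memory copy answers such aggregate queries without touching disk, which is the property in-memory BI products exploit.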
== Traditional disk-based BI technology ==
Historically, every computer has two types of data storage mechanisms: main memory (RAM), which is fast but volatile and comparatively small, and disk storage, which is slower but larger and retains data without power. Traditional BI systems keep the data of record on disk and read it into memory only as queries demand.
Though [[SQL]] is a very powerful query tool, complex queries against disk-resident data take a relatively long time to execute, and running them against operational databases can degrade the performance of transaction processing.
== Disadvantages of traditional BI technology ==
To avoid performance issues and provide faster query processing when dealing with large volumes of data, organizations turned to optimized database methods such as creating [[index (database)|index]]es, pre-aggregating data, and designing [[OLAP cube]]s and [[star schema]]s.
The point of having a [[data warehouse]], however, is to be able to answer any query at any time, while such optimizations speed up only the queries they were designed for.
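The effect of one such optimization, an index, can be sketched with SQLite's query planner (a hypothetical example; the table, column, and index names are invented):

```python
import sqlite3

# Hypothetical example: an index lets the planner replace a full table
# scan with an index search on the filtered column.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(i, "east" if i % 2 else "west", i * 1.5)
                for i in range(1000)])

query = "SELECT SUM(amount) FROM orders WHERE region = 'east'"
plan_before = db.execute("EXPLAIN QUERY PLAN " + query).fetchall()

db.execute("CREATE INDEX idx_region ON orders (region)")
plan_after = db.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[0][-1])  # typically a full scan, e.g. "SCAN orders"
print(plan_after[0][-1])   # a search using idx_region
```

Each index or pre-built aggregate, however, has to be designed and maintained in advance for queries that were anticipated, which is the overhead described above.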
== In-memory tools vs. traditional BI tools ==
The arrival of [[Column-oriented_DBMS|column centric databases]], which store similar information together, allows data to be stored more efficiently and with greater compression, making it practical to hold very large data sets in memory and to scan them quickly.
Most in-memory tools also use compression to reduce the footprint of the data, so that larger data sets fit in the available RAM.
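The combined effect of column-centric layout and compression can be sketched in a few lines (a toy illustration with invented data, not any vendor's storage format):

```python
import json
import zlib

# Toy data set: 1,000 rows of (date, region, amount).
rows = [("2013-06-%02d" % (i % 30 + 1),
         "east" if i % 2 else "west",
         i * 1.5)
        for i in range(1000)]

# Row-centric layout interleaves unlike values; column-centric layout
# stores each column's similar values together.
row_layout = json.dumps(rows).encode()
col_layout = json.dumps([list(col) for col in zip(*rows)]).encode()

row_size = len(zlib.compress(row_layout))
col_size = len(zlib.compress(col_layout))

# The columnar copy is expected to compress smaller, because repetitive
# dates and region names sit next to each other, so more rows fit in
# the same amount of RAM.
print(col_size, "<", row_size)
```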
== Factors driving in-memory technology ==
'''Cheaper and higher-performing hardware''': According to [[Moore's law]], computing power doubles every two to three years while its cost falls. CPU processing, memory, and disk storage are all subject to some variation of this law. Hardware innovations such as multi-core architecture, NAND flash memory, parallel servers, and increased memory processing capability, together with software innovations such as column-centric databases, compression techniques, and better handling of aggregate tables, have all contributed to demand for in-memory products.<ref>{{cite web|last=Kote|first=Sparjan|title=In-memory computing in Business Intelligence|url=http://www.infosysblogs.com/oracle/2011/03/in-memory_computing_in_busines.html}}</ref>
'''Data volumes''': As the data used by organizations grew, traditional data warehouses could no longer deliver timely, accurate, real-time data. The extract, transform, load ([[Extract, transform, load|ETL]]) process that periodically updates a warehouse with operational data can take anywhere from a few hours to weeks to complete, so at any given point the data is at least a day old. In-memory processing makes it possible to access terabytes of data instantly for real-time reporting.
'''Reduced costs''': In-memory processing comes at a lower cost than traditional BI tools and can be deployed and maintained more easily. According to a Gartner survey, deploying traditional BI tools can take as long as 17 months. Many data warehouse vendors are choosing in-memory technology over traditional BI to speed up implementation times.