Trends and Outliers

TIBCO Spotfire's Business Intelligence Blog

Category Archives: In-Memory Processing

03/11
2014

Big Data Connectivity: Getting the Most Out of Your Data Assets

Big data is a double-edged sword. It offers immense potential for companies to quickly spot customer, market, and other trends that can be used to drive business and operational value.

However, the steps needed to gather, store, and manage massive data sets can be daunting for IT and business leaders.

At TIBCO Spotfire, we’re continually striving to make it easier for our clients to better leverage the complete range of data assets that are available to them. Today, we’re excited to announce direct connectivity with seven repositories of big data sources.


05/07
2013

The Business and Productivity Benefits of In-Memory Analytics

In-memory analytics enable business users to handle significantly higher volumes of data faster than traditional analytics tools.

In fact, users of in-memory analytics are able to process more than three times the volume of data, at speeds more than 100 times faster, than their competitors, according to a study conducted by Aberdeen Group.

That’s largely because in-memory technologies avoid the latency of repeated disk access, so users of in-memory analytics tools can access and act on data much faster than users of traditional analytics systems.
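As a rough illustration of that latency gap, here is a toy Python sketch (a hypothetical example, not Spotfire internals): recomputing even a simple total is far cheaper when the rows are already resident in memory than when every query has to re-read and re-parse them from disk.

```python
# Toy comparison: answering the same query from disk vs. from memory.
import os
import tempfile
import time

# Write 10,000 hypothetical measurements to a temp file, one per line.
path = os.path.join(tempfile.mkdtemp(), "measurements.txt")
with open(path, "w") as f:
    for i in range(10_000):
        f.write(f"{i * 0.5}\n")

def total_from_disk(p):
    """Re-read and re-parse the file on every query (disk-bound style)."""
    with open(p) as f:
        return sum(float(line) for line in f)

rows = [i * 0.5 for i in range(10_000)]  # the same data, held in RAM

def total_in_memory(data):
    """Scan rows already resident in memory."""
    return sum(data)

start = time.perf_counter()
for _ in range(50):
    disk_total = total_from_disk(path)
disk_time = time.perf_counter() - start

start = time.perf_counter()
for _ in range(50):
    mem_total = total_in_memory(rows)
mem_time = time.perf_counter() - start

assert disk_total == mem_total  # same answer, very different cost
print(f"disk: {disk_time:.4f}s  memory: {mem_time:.4f}s")
```

On typical hardware the in-memory loop finishes orders of magnitude sooner, and the gap widens as the data set grows.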


04/30
2013

In-Memory Analytics: Taming the Big Data Storage Beast

These days, companies are seeing greater volumes of customer, ERP, and other types of data streaming into their organizations, and this is placing an immense burden on their storage systems.

IDC forecasts that global digital data will grow 50-fold from 2010 to 2020. Meanwhile, the volume of business data is growing at an average rate of 36% per year, according to research by Aberdeen Group.

The three key challenges often associated with big data are the “3 Vs”: volume, velocity, and variety, as Aberdeen and other industry experts note.


08/23
2012

Using Big Data to Make Big Money

One of the greatest strengths of big data is how it can be used to provide fresh insights to decision makers.

Big data can reveal customer and market trends that, when spotted quickly enough, can lead executives to move rapidly on new business ventures ahead of competitors.

The beauty of the increasingly digital landscape is that every single digital interaction that occurs – from scheduling a doctor’s visit via email to entering a chat session with a customer agent about a product issue to commenting about a brand experience on Facebook – becomes an electronic record that companies can make use of, notes Inc. magazine reporter J.J. McCorvey in a recent blog post.

And businesses can now take advantage of this “treasure trove” of data thanks, in part, to “dramatic reductions in the costs of storage, the ability to integrate data streams from multiple sources, and the rich analytics tools that are available to mine and analyze this information,” says Todd Nash of Chicago Business Intelligence Group.


06/26
2012

Transform 2012 – Moving Business from Survival to Thrive Mode

Today marks the start of the TIBCO Transform 2012 user conference in London. This sold-out event features @Gartner_Inc’s James Richardson on the business value of in-memory BI.

Here’s what nearly 1,000 participants can expect to gain from the conference and how you can benefit from your seat in cyberspace.

Richardson’s session focuses on why users of traditional BI systems are moving to visualization-based data analytics platforms to gain competitive advantages. Session topics include drivers, trends, best practices, and how IT can better address the analytics needs of the enterprise.

Other conference highlights include an examination of the major challenges in organizations and how transforming to an event-enabled enterprise can move the organization from survival mode to thrive mode.

Participants at both today’s event and the Thursday event in Paris will learn how to use TIBCO’s platform for making sense of tons of data, how to find “what’s hiding in their data” and how to respond to these new insights in real time.

Customers from companies such as Sony and Dogs for the Blind join representatives from TIBCO to give real-world insights and practical advice on how others can transform their businesses.


Amanda Brandon
Spotfire Blogging Team


02/09
2012

In-Memory Analytics Tools To Take Center Stage In 2012

To play off one of the quotes uttered in the 1986 Tom Cruise blockbuster “Top Gun”: “I feel the need . . . the need for speed.” And when it comes to in-memory analytics, that’s the name of the game. According to Cindi Howson (@biscorecard), in-memory technology remained prominent in 2011 and will continue to capture our full attention in 2012, with its ability to provide speed-of-thought analysis on ever-increasing amounts of data.

Of course, as Forrester Research’s Boris Evelson (@bevelson) notes in a blog post on the topic, not all in-memory analytics tools are created equal. Although Evelson’s post dates back to September 2010, his words continue to hold true today. As he notes, there are distinct differences between in-memory indexes, in-memory OLAP, in-memory ROLAP, and other approaches.

As Evelson further notes, while there is a raft of commodity features worth comparing, such as data integration and portal integration, it’s also important to evaluate the distinguishing features that various tools tout. These include compression ratios and load speeds for in-memory data, as well as the ability to combine advanced analytics with traditional BI reporting and analysis.


12/07
2010

How Does In-Memory Processing Work?

Understanding the basics of in-memory processing is as easy as learning your ABCs:

A:  What is it?

In-memory processing is a fairly simple yet very powerful innovation.  Here’s how it works:

Retrieving data from disk storage is the slowest part of data processing, and the more data you need to work with, the more the retrieval step slows down the analytics process. The usual way of addressing this time problem has been to pre-process data in some way (cubes, query sets, aggregate tables, etc.) so the computer can “go get” a smaller number of records. But those approaches typically require guessing in advance which data should be selected and how it should be arranged for analysis. If and when the analyst needs more or different data, it’s back to the drawing board.
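The contrast above can be sketched in a few lines of Python (a hypothetical example with made-up sales records, not how any particular product implements it): once the raw rows are held in memory, any grouping can be computed on demand, with no pre-built cube or aggregate table to guess at in advance.

```python
# Sketch: ad-hoc aggregation over rows held in memory -- no pre-built cube.
from collections import defaultdict

# Hypothetical sales records, already loaded into RAM.
sales = [
    {"region": "EMEA", "product": "A", "amount": 120.0},
    {"region": "EMEA", "product": "B", "amount": 75.0},
    {"region": "APAC", "product": "A", "amount": 200.0},
    {"region": "APAC", "product": "B", "amount": 50.0},
]

def aggregate(rows, key):
    """Sum `amount` grouped by any column the analyst picks at query time."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row["amount"]
    return dict(totals)

# The analyst can pivot by region, then by product, without going back
# to the drawing board to rebuild an aggregate table.
by_region = aggregate(sales, "region")    # {'EMEA': 195.0, 'APAC': 250.0}
by_product = aggregate(sales, "product")  # {'A': 320.0, 'B': 125.0}
```

With a disk-era pre-aggregation approach, each of those two pivots would have required its own pre-computed table; in memory, both are just a fresh scan of the same rows.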
