A Talksum Perspective – Making Big Data Implementations Successful

Alex Varshavsky, CEO, Talksum

In a recent Capgemini Consulting report entitled Cracking the Data Conundrum: How Successful Companies Make Big Data Operational, it was noted that global organizations spent over $31 billion on Big Data in 2013, a figure predicted to reach $114 billion by 2018. Nearly 60 percent of the executives surveyed for the report believe that Big Data will disrupt their industry in the next three years.

The report goes on to state, however, that even though these global enterprises embrace Big Data, only 13 percent have achieved full-scale production of their Big Data initiatives. This raises the question: what is keeping them from full-scale production and operational success?

The report revealed these challenges:

  • Scattered silos of data.
  • Ineffective coordination of analytics initiatives.
  • Lack of strong data management and governance mechanisms.
  • Dependence on legacy systems to process and analyze the data.

We’ll look at each of these challenges and how the Talksum data management solution can help overcome them.


Scattered Silos of Data

The report noted that 79 percent of those surveyed had not fully integrated their data sources across the organization. This lack of integration created barriers to a unified view of data that prevented decision-makers from making accurate and timely decisions. In the report, Filippo Passerini, CIO of US-based consumer products leader P&G, said “To move the business to a forward-looking view, we realized we needed one version of the truth. In the past, decision-makers spent time determining sources of the data or who had the most accurate data. This led to a lot of debate before real decisions could be made.”

The Talksum solution handles massive amounts of disparate data originating from multiple sources. A single rack-unit appliance ingests the data, sorts and filters it, and shapes it into a holistic view of the organization’s data resources. This allows enterprises to apply business logic early in the process, before the data is even stored, and to determine what needs to be acted upon in real time and what needs to be routed to the respective downstream destinations. The Talksum Data Stream Router (TDSR) gives all applicable entities a holistic view of the information and breaks down the silos that create barriers to knowledge, insight, and action.
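To make the idea concrete, the flow described above can be sketched in a few lines of Python. This is an illustrative model only, not Talksum code: the TDSR is a hardware appliance and its internal API is not public, so the rule names and event fields below are invented for the example.

```python
# Apply business logic to each event BEFORE it is stored: the first
# matching rule decides where the event goes; everything else is archived.
def route_event(event, rules):
    """Return the destination for one event, evaluated at ingest time."""
    for rule in rules:
        if rule["match"](event):
            return rule["destination"]
    return "archive"  # default: retain everything for later analysis

# Hypothetical business rules: act on critical events in real time,
# route finance records to the finance warehouse, archive the rest.
rules = [
    {"match": lambda e: e.get("severity") == "critical",
     "destination": "realtime-alerts"},
    {"match": lambda e: e.get("source") == "finance",
     "destination": "finance-warehouse"},
]

route_event({"source": "sensor-7", "severity": "critical"}, rules)
# -> "realtime-alerts"
```

The key point the sketch captures is ordering: routing decisions happen at ingest, so the real-time path and the downstream storage path diverge before anything is written to disk.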


Ineffective Coordination of Analytics Initiatives

The Capgemini report noted, “A major stumbling block is a lack of adequate coordination among analytics teams.” Scattered resources and decentralized teams break down best practices that can be shared among the groups. As Eric Spiegel, CEO of Siemens USA, puts it, “Leveraging Big Data often means working across functions like IT, engineering, finance, and procurement, and the ownership of data is fragmented across the organization.”

Different teams often have different analytics systems and BI tools. After the TDSR makes sense of the incoming information, it reduces, enriches, analyzes, and contextually routes the data – in real time – to where it is supposed to go. Engineering receives the information relevant to it; finance receives the information relevant to it; and so on. In addition, all of the data can be stored for archival purposes in case it is needed later. If multiple organizations need the same data, it is delivered to each of them. Organizations receive data on a need-to-know basis.
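The need-to-know fan-out described above can be sketched as a set of subscriptions: each team declares what it cares about, every matching team gets a copy, and everything is also archived. Again, this is a hypothetical illustration; the team names and predicates are not the TDSR’s actual interface.

```python
# Contextual, need-to-know routing: one event is delivered to every team
# whose interest predicate matches, plus the archive.
def deliver(event, subscriptions):
    recipients = [team for team, interested in subscriptions.items()
                  if interested(event)]
    recipients.append("archive")  # all data is retained for later use
    return recipients

# Hypothetical subscriptions per team.
subscriptions = {
    "engineering": lambda e: e["topic"] in ("latency", "errors"),
    "finance":     lambda e: e["topic"] in ("billing", "cost"),
}

deliver({"topic": "cost", "value": 1200}, subscriptions)
# -> ["finance", "archive"]
```

If two teams’ predicates both match, both appear in the recipient list – which is exactly the “delivered to each of them” behavior the article describes.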


Lack of Strong Data Management and Governance Mechanisms

Coordination and governance become significant challenges during implementation. Twenty-seven percent of those surveyed cited ineffective governance models for Big Data and analytics. And coordinating data at ingestion and output requires the right data management capabilities. Talksum is a leader in high-speed data processing and management solutions whose mission is to develop new ways of processing, routing, and managing Big Data. This is another way of saying that we are a data management company – first and foremost. A significant benefit of the Talksum data management solution is built-in governance: the TDSR includes the foundational components for governance, regulatory compliance, government standards, and policy control.


Dependence on Legacy Systems to Process and Analyze the Data

Of those who responded to the Capgemini survey, 31 percent said their organization was dependent on legacy systems for data processing and management. It’s hard for these organizations to switch to newer systems for fear of incompatibility, data loss, and lost time.

The designers of the Talksum solution took this into consideration and built compatibility into the product. The TDSR works with both legacy and newer systems. The unit slides into a server-room rack without extensive or complicated coding – it requires only some configuration settings – and can output data to both legacy and newer systems, including SharePoint, MySQL, AWS, MongoDB, Hadoop, in-memory options, and others.
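“Only some configuration settings” might look something like the sketch below: a declarative list of output destinations rather than custom integration code. Every key and value here is invented for illustration – the product’s real configuration format is not public.

```python
# Hypothetical, illustrative output configuration: each downstream
# destination (legacy or modern) is declared, not coded.
output_config = {
    "outputs": [
        {"type": "mysql",   "host": "legacy-db.internal", "table": "events"},
        {"type": "mongodb", "uri": "mongodb://archive.internal:27017/events"},
        {"type": "hdfs",    "path": "/data/raw/events"},  # Hadoop-backed store
    ]
}

def validate(config):
    """Minimal sanity check: every declared output names its type."""
    return all("type" in out for out in config["outputs"])

validate(output_config)
# -> True
```

The design point is that adding or retiring a downstream system means editing a declaration, not rewriting integration code – which is what makes coexistence of legacy and newer systems practical.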


Talksum Offers More Than Successful Big Data Implementation

To sum it up, the Talksum solution does more than make Big Data implementations successful; it can also save enterprises up to 80 percent of what it costs to build and run a data center by providing the following:

  • Operational efficiency.
  • Systems interoperability.
  • Infrastructure footprint reduction.
  • Policy compliance.
  • Added security.

To learn more about these, contact us and we’ll send you more information.



Breaking the Silence: Real-Time Data Management and the Talksum Private Beta

Things have been quiet here on the Talksum blog for quite a while, but certainly not for lack of activity behind the scenes. Since our last few posts, which were limited to technical notes about projects we’ve contributed to, we have been hard at work on two major fronts.

First, we’ve been building our product, moving from research prototype to commercial offering. On that front, we’re very excited to announce that the private beta program for our appliance-based Talksum Data Stream platform is well under way. We’ve been lucky enough to work with several major enterprises in different verticals to prepare our product to be let loose in the wild, applied to real-world data management and analytics problems. We’re working with NetFlow data to optimize data center topology and disaster recovery strategy. We’re enriching data in real time for optimized integration. We’ve been proving how our real-time data management and data reduction tools can shrink storage and decrease latency. As I said, we’ve been busy.
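The storage-shrinking effect of real-time data reduction is easy to see with flow data. The sketch below is not Talksum code – it is a generic illustration of the technique, rolling many per-flow records up into one aggregate per source/destination pair before anything is stored.

```python
from collections import defaultdict

# Roll raw flow records up into per-(src, dst) byte totals.
# Each record is a (src_ip, dst_ip, bytes) tuple, NetFlow-style.
def reduce_flows(records):
    totals = defaultdict(int)
    for src, dst, nbytes in records:
        totals[(src, dst)] += nbytes
    return dict(totals)

flows = [
    ("10.0.0.1", "10.0.0.9", 500),
    ("10.0.0.1", "10.0.0.9", 700),
    ("10.0.0.2", "10.0.0.9", 300),
]

reduce_flows(flows)
# three raw records reduce to two aggregates
```

Done in the stream rather than in batch, this kind of roll-up cuts both what must be written downstream and the latency before the aggregate is available.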

Secondly, we’ve been using these experiences to really hone our product vision. As we’ve talked to numerous companies and started working with our initial group of beta candidates, we’ve become increasingly passionate about the need for new approaches to “Big Data.”

We see lots of vendors working on how to run analytics on larger data sets, how to spin enterprise offerings out of hugely distributed cloud approaches, or how to re-tool the Hadoop ecosystem for lower-latency solutions. However, the more we talk to big companies, the more we see an underlying need to re-assess larger data management practices. That’s where we see Talksum fitting in – we bring real-time processing to the core of data management, data integration, and data-focused solution development.

We’ll be writing much more about Real-Time Data Management and the benefits it brings to the enterprise. For now, we’re happy to announce our private beta and to tell the world that we’re excited to be coming out of our stealthy period and into the fray!