A Talksum Perspective – Making Big Data Implementations Successful

Alex Varshavsky, CEO, Talksum

In a recent Capgemini Consulting report entitled Cracking the Data Conundrum: How Successful Companies Make Big Data Operational, it was noted that global organizations spent over $31 billion on Big Data in 2013, a figure predicted to reach $114 billion by 2018. Nearly 60 percent of the executives surveyed for the report believe that Big Data will disrupt their industry in the next three years.

The report goes on to state, however, that even though these global enterprises embrace Big Data, only 13 percent have achieved full-scale production of their Big Data initiatives. This raises the question: what is keeping them from full-scale production and operational success?

The report revealed these challenges:

  • Scattered silos of data.
  • Ineffective coordination of analytics initiatives.
  • Lack of strong data management and governance mechanisms.
  • Dependence on legacy systems to process and analyze the data.

We’ll look at each of these challenges and how the Talksum data management solution can help overcome them.

 

Scattered Silos of Data

The report noted that 79 percent of those surveyed had not fully integrated their data sources across the organization. This lack of integration created barriers to a unified view of data that prevented decision-makers from making accurate and timely decisions. In the report, Filippo Passerini, CIO of US-based consumer products leader P&G, said “To move the business to a forward-looking view, we realized we needed one version of the truth. In the past, decision-makers spent time determining sources of the data or who had the most accurate data. This led to a lot of debate before real decisions could be made.”

The Talksum solution handles massive amounts of disparate data originating from multiple sources. The single rack unit (RU) device ingests the data, sorts and filters it, and makes it useful for a holistic view of an organization's data resources. This allows enterprises to apply business logic early in the process, before the data is even stored, and to determine what needs to be acted upon in real time and what should be routed to the respective downstream systems. The Talksum Data Stream Router (TDSR) gives all applicable entities a holistic view of the information and breaks down the silos that create barriers to knowledge, insights, and action.
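To make the idea concrete, here is a minimal sketch, in Python, of what "applying business logic before the data is stored" can look like. The TDSR itself is a configurable appliance, not a Python library, so the function and field names below are invented purely for illustration.

```python
# Hypothetical sketch of classifying events at ingest, before anything is stored.
# None of these names come from the Talksum product; they only illustrate the pattern.

def route_event(event: dict) -> str:
    """Decide where an incoming event should go before it is written anywhere."""
    if event.get("severity") == "critical":
        return "realtime_alerts"      # act on it immediately
    if event.get("source") == "finance":
        return "finance_warehouse"    # downstream system of record
    return "archive"                  # everything else goes to cold storage

events = [
    {"source": "sensor", "severity": "critical", "value": 97},
    {"source": "finance", "severity": "info", "value": 12},
]

for e in events:
    print(route_event(e), "<-", e)
```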

 

Ineffective Coordination of Analytics Initiatives

The Capgemini report noted, “A major stumbling block is a lack of adequate coordination among analytics teams.” Scattered resources and decentralized teams break down best practices that can be shared among the groups. As Eric Spiegel, CEO of Siemens USA, puts it, “Leveraging Big Data often means working across functions like IT, engineering, finance, and procurement, and the ownership of data is fragmented across the organization.”

Different teams often have different analytics systems and BI tools. After the TDSR makes sense of the incoming information, it reduces, enriches, analyzes, and contextually routes the data – in real time – to where it needs to go. Engineering receives the information relevant to them; finance receives the information relevant to them; and so on. In addition, all of the data can be stored for archival purposes in case it is needed later. If multiple organizations need the same data, it is delivered to each of them. Organizations receive data on a need-to-know basis.
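As an illustration only, the routing just described can be pictured as a set of need-to-know rules evaluated against each record, with every matching destination receiving a copy. The rule names and fields below are assumptions, not the TDSR's configuration language.

```python
# Illustrative routing rules: deliver each record to every team whose
# "need-to-know" predicate matches; everything is also archived.

ROUTING_RULES = {
    "engineering": lambda r: r["topic"] in {"build", "telemetry"},
    "finance":     lambda r: r["topic"] in {"billing", "procurement"},
    "archive":     lambda r: True,
}

def destinations(record: dict) -> list[str]:
    return [team for team, matches in ROUTING_RULES.items() if matches(record)]

print(destinations({"topic": "billing", "amount": 1200}))
# ['finance', 'archive']
```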

 

Lack of Strong Data Management and Governance Mechanisms

Coordination and governance become significant when dealing with implementation challenges. Twenty-seven percent of those surveyed cited ineffective governance models for Big Data and analytics, and coordinating data at ingestion and output requires the right data management capabilities. Talksum is a leader in high-speed data processing and management solutions whose mission is to develop new ways of processing, routing, and managing Big Data. This is another way of saying that we are a data management company – first and foremost. A significant benefit of the Talksum data management solution is built-in governance: the TDSR includes the foundational components for governance, regulatory compliance, government standards, and policy control.
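To give a feel for what policy control at the routing stage can mean, here is a small, hypothetical sketch. Actual governance rules (masking, retention, jurisdiction) would be configured on the device; the field names and rules below are invented for illustration.

```python
# Hypothetical policy step applied to each record before it is routed onward.

import re

def apply_policy(record: dict) -> dict:
    """Mask personal data and tag the record with an assumed retention class."""
    cleaned = dict(record)
    if "email" in cleaned:
        cleaned["email"] = re.sub(r"(^.).*(@.*$)", r"\1***\2", cleaned["email"])
    cleaned["retention"] = "7y" if cleaned.get("domain") == "finance" else "90d"
    return cleaned

print(apply_policy({"email": "jane.doe@example.com", "domain": "finance"}))
# {'email': 'j***@example.com', 'domain': 'finance', 'retention': '7y'}
```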

 

Dependence on Legacy Systems to Process and Analyze the Data

Of those who responded to the Capgemini survey, 31 percent said their organization was dependent on legacy systems for data processing and management. It’s hard for these organizations to switch to newer systems for fear of incompatibility, of losing data, and of losing time.

The Talksum solution was designed with this in mind, and compatibility is built into the product. The TDSR works with both legacy and newer systems. The unit slides into a server room rack without the need for extensive and complicated coding (it requires only some configuration settings), and it can output data to both legacy and newer systems, including SharePoint, MySQL, AWS, MongoDB, Hadoop, in-memory options, and others.
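The "configuration, not coding" point can be pictured as a declarative map of output targets covering both legacy and newer systems. The schema below is an assumption made for illustration and is not the TDSR's actual configuration format.

```python
# Invented example of declaring output targets for mixed legacy/modern systems.

OUTPUTS = {
    "legacy_mysql":  {"type": "mysql",   "dsn": "mysql://reports.internal/ops"},
    "hadoop_lake":   {"type": "hdfs",    "path": "/data/raw/events"},
    "mongo_catalog": {"type": "mongodb", "uri": "mongodb://catalog.internal"},
    "s3_archive":    {"type": "aws_s3",  "bucket": "company-event-archive"},
}

def targets_for(record: dict) -> list[str]:
    # Legacy and newer systems can receive the same stream side by side.
    targets = ["s3_archive", "hadoop_lake"]
    if record.get("kind") == "report":
        targets.append("legacy_mysql")
    return targets

print(targets_for({"kind": "report"}))
```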

 

Talksum Offers More Than Successful Big Data Implementation

To sum it up, the Talksum solution offers more than making Big Data implementations successful; it can also save enterprises up to 80 percent of what it costs to build and run a data center by providing the following:

  • Operational efficiency.
  • Systems interoperability.
  • Infrastructure footprint reduction.
  • Policy compliance.
  • Added security.

To learn more about these, contact us and we’ll send you more information.

Click here for more information about the Talksum product and its features.

 

A Talksum Perspective – Understanding Data Early in the Process

Barry Strauss, Head of Marketing, Talksum

With the amount of data increasing at exponential rates, you would think that businesses today would have it made: awash in useful information, they could apply it to problems on the spot, react instantaneously, and even be more proactive than ever. Unfortunately, this is most often not the case.

The inordinately massive amount of available information has, ironically, put a strain on the enterprise, and more specifically on the network infrastructure. Big Data that gets bottlenecked and cannot be served properly becomes responsible for downtime, operational inefficiencies, potential disasters, policy violations, lost opportunities, higher costs, security violations, and other problems.

Ultimately, the strained infrastructure triggers financial losses, because real-time decisions cannot be made with the data.

The traditional approach to fixing this problem is to first store the data and then make something out of it. The focus of innovation has been to make storage bigger and faster (improving traditional databases and building new storage solutions such as Hadoop, in-memory databases, and others) and to build analytics on top of it. This becomes complex and expensive to implement and also raises concerns about scalability and stability.

Talksum has tackled this problem with an efficient approach – understanding data and then acting upon it in real time before storing. This allows enterprises to apply business logic early in the process before data is stored, and optimize what needs to be acted upon in real time and what needs to be routed to respective downstream sources.

Again, the key point of the Talksum approach is to understand the data first before it is stored, as opposed to today’s approach of storing first, and then trying to figure out what is in the data.

At the end of the day, it all comes down to saving money and eliminating financial losses. The underlying, innovative approach built into the Talksum product allows a single Talksum Data Stream Router (TDSR) to replace racks of servers in the data center. This drastically reduces the infrastructure footprint, staffing and support needs, energy consumption, the number of required software licenses, and other costs. At the same time, the TDSR reduces the number of steps and the amount of time (from days or weeks to seconds and minutes) needed to make timely decisions, create reports, build charts, and take appropriate actions.

The Talksum solution also helps data centers solve one of their biggest challenges – dealing with multiple logging formats and data schemas – by providing a universal logging profiler, and it addresses other systems interoperability problems as well.
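As a toy illustration of what a universal logging profiler does in spirit, the sketch below detects a log line's format and normalizes it to a single schema. The parsers and field names are invented; the TDSR's actual profiler is part of the appliance, not Python code.

```python
# Toy normalizer: map two different log formats onto one common schema.

import json
import re

def normalize(line: str) -> dict:
    if line.startswith("{"):                        # JSON-formatted logs
        raw = json.loads(line)
        return {"ts": raw.get("time"), "level": raw.get("level"), "msg": raw.get("message")}
    m = re.match(r"(\S+) (\w+) (.*)", line)         # "timestamp LEVEL message" logs
    if m:
        return {"ts": m.group(1), "level": m.group(2), "msg": m.group(3)}
    return {"ts": None, "level": "UNKNOWN", "msg": line}

print(normalize('{"time": "2015-06-01T12:00:00Z", "level": "WARN", "message": "disk 90% full"}'))
print(normalize("2015-06-01T12:00:01Z ERROR backup job failed"))
```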

The TDSR performs all of this while enforcing security and policy compliance.

To sum it up, the TDSR replaces the many point solutions typically applied to these challenges with a single solution that covers them all. And it does this in real time, at network speed.

 

Click here for more information about the Talksum product and its features.

Data Science as a Solution

Dale Russell, CTO, Talksum

As our understanding of data science problems evolves, we find that effective solutions apply a systematic approach to testing, measuring, and building knowledge of the whole data system. To create this holistic view of data effectively and efficiently, first consider the entirety of the data landscape, from the infrastructure up to Layer 7. A comprehensive data science solution should not favor data from any one layer over another. When architecting a solution, keep in mind that business requirements will change, message types and objects will change, and the volume of data from the various OSI layers will change, especially as the Internet of Things (IoT) becomes more of a reality.

To best deal with an ever-changing data landscape, follow this important principle: never leave work for a downstream process. Datasets will continue to grow in volume and diversity, and solutions will be expected to take less time to process data or make it actionable. Store-and-sort is a costly strategy regardless of who owns the infrastructure. We have found that the best approach is to sort first, then store.
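Here is a minimal sketch of the "sort first, then store" principle, with invented message types: it classifies each message as it arrives, so downstream stores receive data that is already separated instead of being sorted after the fact.

```python
# Illustrative only: classify messages in-stream instead of store-and-sort.

from collections import defaultdict

def sort_then_store(stream):
    stores = defaultdict(list)
    for msg in stream:
        stores[msg.get("type", "unclassified")].append(msg)
    return stores

stream = [{"type": "metric", "v": 1}, {"type": "audit", "v": 2}, {"v": 3}]
print({k: len(v) for k, v in sort_then_store(stream).items()})
# {'metric': 1, 'audit': 1, 'unclassified': 1}
```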

Over the last 15 years, exceptional and innovative storage solutions have been developed using distributed software, socket libraries, and advanced cloud services. These bring substantial performance increases, benefiting data center environments that face concerns about latency, growing storage, or increased demand for analytics on their datasets. As innovations in this sector bring more data into your landscape, you can enable great data science by taking a broader approach.

While some solutions focus on a subset of problems, a great data science solution deals with the entirety of information across the data landscape. In working with our customers and partners, we found that any acceptable solution must not only accommodate changing data requirements but do so in a manner that maintains the highest level of data fidelity. If new analytical processes are created, the solution should easily direct the correct data streams to those processes without a lot of work for your team.

A proper data science solution empowers the organization to focus on asking forward-looking questions of its data, rather than requiring it to constantly invest time searching for new data solutions every time the data landscape changes (as it will continue to do).

 

 

Talksum Secures Large University Installation

Alex Varshavsky, CEO, Talksum

This week, we secured several installations as part of our strategic initiative to help large data centers use the Talksum Data Stream Router (TDSR) in their facilities to manage their Big Data projects. The Talksum solution helps data centers by streaming data – including actionable alerts – from disparate sources ingested into the TDSR to the different types of targeted storage and BI platforms currently in use by the data center.

A large US-based university is using the TDSR to aggregate log and other data and contextually route it, often in the form of alerts, to multiple back-end repositories.

At the university installation, the TDSR is aggregating logs and, based on the content, contextually routing the information to multiple areas, including agriculture and biosciences; engineering and physical sciences; business and technology; marine and ocean sciences; health and behavioral science; sustainability and the environment; and space science departments.

Using the TDSR, the university can act on varied data from multiple sources in real time through automated responses. The TDSR allows users to quickly move data where it is needed and in the format it is needed. It helps to streamline service delivery and to boost overall data performance – all without the need to change the infrastructure.

To learn more about the Talksum solution, and how the TDSR can help you, contact our Sales Team.

Or e-mail your request to info@talksum.com, specifying your company's needs, its geographic location, and your contact information. We will contact you shortly after your request has been submitted.

U-2 Spy Plane Causes LAX to Shut Down; Could Have Been Avoided

Alex Varshavsky, CEO, Talksum

A few weeks ago, I was at the San Jose Airport (SJC) waiting for a commuter flight to Los Angeles (LAX) when a voice came over the loudspeaker to announce that the flight had been cancelled. Passengers were told to take a shuttle bus to the San Francisco Airport (SFO) and proceed from there. The next day, I searched for reasons why the flight was cancelled and found out that data from a U-2 spy plane’s flight plan “confused software” that helps track and route aircraft around the region.

When that system failed, a backup helped safely guide flights already in the air, but hundreds of flights across the nation headed for Southern California were either cancelled or delayed as the air traffic control facility north of Los Angeles effectively rebooted.

In an Associated Press article, it was reported that the spy plane, which was conducting training operations in the area, flies at about 60,000 feet under “visual rules.” According to the FAA, a computer perceived a conflict between that altitude and the use of visual flight rules, and began trying to route the plane to 10,000 feet to avoid flight collisions. The number of adjustments that would need to be made to the routes of other planes throughout the area overwhelmed the software.

When the system went down, air traffic controllers had to manually call their counterparts at other centers to update them on each plane’s flight plan.

The Talksum Data Stream Router (TDSR) handles cases such as this through its real-time, cross-domain data management function. Not only would the TDSR ingest data, filter it, and route the alerts, in real time, to the various air traffic controllers, it would also reduce the data so as not to overload the system and overwhelm the software and, simultaneously, route the complete data stream to another database for analysis and archiving.
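To illustrate that pattern (not the TDSR's internals), the sketch below reduces the stream for the real-time alerting path while the complete stream is still routed to an archive; the thresholds and field names are invented.

```python
# Hypothetical split: a reduced alert stream plus a complete archive stream.

def split_stream(events):
    alerts, archive = [], []
    for e in events:
        archive.append(e)                     # full stream kept for analysis
        if e.get("conflict") and e.get("altitude_ft", 0) > 18000:
            alerts.append({"id": e["id"], "summary": "altitude conflict"})
    return alerts, archive

alerts, archive = split_stream([
    {"id": "U2-1", "conflict": True,  "altitude_ft": 60000},
    {"id": "AA12", "conflict": False, "altitude_ft": 32000},
])
print(len(alerts), len(archive))   # -> 1 2
```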

A single TDSR processes millions of complex events per second, in real time, and simplifies the data management process by making it easy to monitor data, analyze it, and send alerts in real time while significantly reducing the cost of post-acquisition processing, ETL integration, and distribution. It is highly configurable, so specialized solutions such as this can be deployed without custom coding. In addition, the Talksum Data Stream Router includes foundational components for regulatory compliance, government standards, and policy control.

Click here for more information about the TDSR, or visit our Contact Us form to request a meeting.