How Big Data Solutions Help Cyber Security at Reduced Costs

Alex Varshavsky, CEO, Talksum

A recent Beacon Report entitled Balancing the Cyber Big Data Equation points out that Big Data is showing great promise on many fronts, including combating fraud, waste, and abuse; improving our nation’s health; and supporting cyber missions.

In addition, during a webinar last summer entitled “Smarter Uncle Sam: The Big Data Forecast,” panelists identified what they felt were the top three priorities for government focus – enhanced cyber security, combining structured and semi-structured data, and using video/data visualization. In this blog, we’ll take a look at their top priority – cyber security – and how Big Data can help.

According to the government-focused webinar panelists, a few of the many applications where Big Data has helped cyber security include:

  • Postal Services: Uses Big Data today to examine the indicia – the postage block located in the upper right-hand corner of mail packages – and detect fraud. The Post Office loses millions of dollars in revenue to criminals who duplicate indicia marks on packages.
  • International Attacks: Uses Big Data to build bubble maps showing attacks from different nations. In the example given at the webinar, Big Data had been used to create a visual map of attacks from nations within a 20-minute time period. A panelist noted that, by marrying Big Data to visualization, the agency could “see” who was attacking most frequently in near real time.
  • Air Traffic: Uses Big Data to examine airline routes, the density created by the number of flights occupying the airspace, and other data, including cyber security information. Officials can use this information to increase the efficiency and safety of airports.

In the Beacon Report, it was noted that Big Data and cyber security, together, “have the potential to fundamentally reinvent broken and siloed Federal information technology (IT).”  The report goes on to emphasize that there is tremendous value in the data currently segregated across the Federal government, but that agencies lack both infrastructure and policy to enable correlation, dissemination, and protection.

The good news today is that a solution does exist. The Talksum Data Stream Router (TDSR) was built for this – fixing broken and siloed information coming from multiple sources and in disparate file structures while enforcing strict cyber protection and detection capabilities. The TDSR ingests disparate data; aggregates, filters, and reduces it (eliminating data that is not of use or that grows stale at the collection point); and intelligently routes pertinent information downstream only to designated recipients with a need-to-know for their specific applications. In addition, because the TDSR operates machine-to-machine and without personally identifiable information (PII), personal identities are never exposed. The solution allows for enhanced data and information sharing over a secure network.
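
As a concrete illustration of that flow, here is a minimal Python sketch of the pattern described above: take a record, drop it if it has grown stale at the collection point, strip out PII, and forward it only to recipients with a need-to-know. The TDSR is a proprietary appliance, so every name, threshold, and routing rule below is a hypothetical stand-in used for illustration, not the product’s actual interface.

```python
import time

# Hypothetical sketch of the pattern described above: filter stale records,
# strip PII, and route only to recipients cleared for the event type.
# All names and values here are invented for illustration.

MAX_AGE_SECONDS = 300                      # records older than this are treated as stale
NEED_TO_KNOW = {                           # recipient -> event types it is cleared to see
    "fraud_team": {"postal_indicia"},
    "network_ops": {"intrusion_attempt"},
}
PII_FIELDS = {"name", "ssn", "address"}    # fields never forwarded downstream


def route(record, now=None):
    """Filter, reduce, and route one record; returns (recipient, payload) pairs."""
    now = now if now is not None else time.time()
    if now - record["timestamp"] > MAX_AGE_SECONDS:
        return []                          # stale: eliminated at the collection point
    payload = {k: v for k, v in record.items() if k not in PII_FIELDS}
    return [(recipient, payload)
            for recipient, allowed in NEED_TO_KNOW.items()
            if record["type"] in allowed]


if __name__ == "__main__":
    event = {"type": "intrusion_attempt", "timestamp": time.time(),
             "src_ip": "203.0.113.7", "name": "J. Doe"}
    for recipient, payload in route(event):
        print(recipient, payload)   # network_ops receives the event, minus the PII field
```

The point of the sketch is simply that filtering and need-to-know routing can happen before any data reaches downstream systems.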

To address challenges related to outputs from different agencies and vendors, the TDSR ingests any type of information and normalizes it so that the information “talks” the same language. This creates a holistic approach that hinges on commonality at the receiving end, which may include different BI and analytical tools, databases, and other storage devices. The system can also implement policy changes immediately as they arise.
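
To make the normalization idea concrete, the sketch below maps three hypothetical input formats (JSON, CSV, and key=value log lines) onto one common event schema so that every downstream consumer sees the same “language.” The field names and parsers are assumptions made for this example; they are not TDSR internals.

```python
import csv
import io
import json

# Illustrative normalization of disparate input formats into one common event
# schema. Field names and formats are invented for the example.

COMMON_FIELDS = ("source", "event_type", "severity")


def normalize_json(raw):
    doc = json.loads(raw)
    return {"source": doc["agency"], "event_type": doc["kind"], "severity": doc["level"]}


def normalize_csv(raw):
    row = next(csv.reader(io.StringIO(raw)))
    return {"source": row[0], "event_type": row[1], "severity": int(row[2])}


def normalize_kv(raw):
    pairs = dict(item.split("=", 1) for item in raw.split())
    return {"source": pairs["src"], "event_type": pairs["evt"], "severity": int(pairs["sev"])}


PARSERS = {"json": normalize_json, "csv": normalize_csv, "kv": normalize_kv}


def normalize(fmt, raw):
    """Convert one raw input into the common event schema."""
    event = PARSERS[fmt](raw)
    assert all(field in event for field in COMMON_FIELDS)
    return event


if __name__ == "__main__":
    print(normalize("json", '{"agency": "USPS", "kind": "indicia_check", "level": 2}'))
    print(normalize("csv", "FAA,route_density,1"))
    print(normalize("kv", "src=DHS evt=port_scan sev=3"))
```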

Furthermore, the Talksum solution costs a fraction of comparable systems, and its simplicity reduces both manpower and server requirements – from thousands of servers to two.

In this age of severe budget cuts, the TDSR allows agencies to take full advantage of Big Data opportunities while managing cyber threats.

In summary, Talksum’s new approach offers government agencies a different way to gain real-time, actionable insights and perform cyber security functions while reducing costs and the need for expensive servers and manpower.


Talksum to Present at CeBIT on Datability and Using Large Volumes of Data Sustainably and Responsibly

Barry Strauss, Talksum Head of Marketing

We’re looking forward to attending CeBIT 2014, March 10-14, in Hannover, Germany, as both a presenter with Skolkovo on the CeBIT Open Stage and a participant on the CeBIT Global Conferences Big Data Expert Panel.

The theme for this year’s staging of CeBIT has been announced as Datability, and it’s all about using large volumes of data sustainably and responsibly. According to CeBIT organizers, “With Datability as its lead theme and motto, CeBIT will serve as the digital industry’s central hub for presenting, discussing, and exploring the many and varied options and opportunities for the intelligent use of Big Data.”

With the volume of data in the digital universe growing by an average of 50 percent annually, we have to look at how good we are at using the information available to us. In the Talksum presentation, we will address different historical events that could have been dramatically altered by using the massive amounts of data available at the time. We’ll look at oil and gas, space, transportation, and public infrastructure, and dissect each. Then we’ll create “what if” scenarios showing how data management solutions such as the Talksum Data Stream Router could have taken full advantage of the data to improve outcomes and/or avert catastrophes.

Talksum provides a new approach to data management and analytics with a focus on speed, simplicity, and value – all three of which are essential ingredients in today’s world of Big Data and its sustainable, responsible use. At the event, we will discuss the architecture that makes this possible.

If you are attending CeBIT this year and would like to meet with a Talksum team member, let us know by filling out the contact form on this website. If you cannot make the event but would still like to speak with a Talksum team member in Hannover or the vicinity, let us know and we’ll be sure to schedule a meeting.

See you soon at CeBIT!


New Mexico Consortium and Los Alamos Labs Big Data Management Workshop

Alex Varshavsky, CEO, Talksum

Last week, we had the pleasure of leading a workshop for the New Mexico Consortium (NMC) and the Los Alamos National Laboratory (LANL) about Big Data management at the data center level.

At the workshop, we discussed how Big Data initiatives need more than just new storage platforms and BI solutions – they need new approaches to data architectures and management strategies. These needs can be broken down into the following requirements:

  • Real-time operation to keep pace with data velocity.
  • Adaptability to meet changing requirements.
  • Simplicity of use to avoid specialized skills and custom code.
  • Low overhead in terms of people, time, and infrastructure.

Enterprises are overloaded with data. In fact, as highlighted in a Marcia Conner blog post, more than 2.5 quintillion bytes of data (a quintillion is 1 followed by 18 zeros) are created every day, with 90 percent of the world’s data created in the last two years alone. As a society, we are producing and capturing more data each day than was seen by everyone in all of earlier history.

Most of this data filters through data centers, and along with the data come Big Data problems, such as:

  • How do you reasonably ingest, transform, route, and analyze data in real time?
  • How can you apply more logic earlier in the pipeline, while minimizing ingest performance impact?
  • How can you begin to create a holistic view of the information in the data so that you can correlate events from multiple domains?

We covered these and other problems with use cases, then followed up with solutions to those use cases. We were excited to show how the Talksum Data Stream Router™ (TDSR™) handles the above, using examples built from the attendees’ own data. The TDSR can ingest multiple transport and application protocols, as well as multiple formats, and convert the incoming data into parallel data streams, or event messages, that are aggregated, filtered, and then contextually routed to their respective end points (summarized data, aggregated data, dynamic streams, data stores, data warehouses, log storage, and so on).
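
For a rough picture of what contextual routing can look like, the sketch below fans incoming event messages out to three simple end points: raw log storage, a per-source aggregate, and a filtered high-severity summary. The rules and end-point names are assumptions made purely for illustration and do not describe the TDSR’s internal design.

```python
from collections import defaultdict

# Illustrative contextual routing: each event message is delivered to the end
# points that want it. End points and rules are invented for the sketch.

log_storage = []                          # end point 1: everything is archived
counts_by_source = defaultdict(int)       # end point 2: aggregated view per source
error_summary = []                        # end point 3: only high-severity events


def route_event(event):
    """Fan one event message out to its relevant end points."""
    log_storage.append(event)
    counts_by_source[event["source"]] += 1
    if event["severity"] >= 3:
        error_summary.append(event)


if __name__ == "__main__":
    stream = [
        {"source": "sensor-a", "severity": 1},
        {"source": "sensor-b", "severity": 4},
        {"source": "sensor-a", "severity": 5},
    ]
    for evt in stream:
        route_event(evt)
    print(dict(counts_by_source))   # {'sensor-a': 2, 'sensor-b': 1}
    print(len(error_summary))       # 2
```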

It was a fruitful workshop and we hope to present at many more.


Infographic: A Disruptive Approach to an Emerging Big Data Market

Alex Varshavsky, CEO, Talksum

As illustrated in our latest infographic (shown below), A Disruptive Approach to an Emerging Big Data Market, and as the title implies, Big Data covers a lot of ground and affects huge, growing markets – both at the horizontal IT and data center level and at the level of the vertical applications tied to those data centers. Every day, 2.5 quintillion bytes of data (a quintillion is 1 followed by 18 zeros) are created. Looking at the IT market, worldwide spending will hit $3.7 trillion in 2014, and data center infrastructure spending alone is estimated to reach $152 billion in 2016.

[Infographic: Talksum Approach to an Emerging Big Data Market]

Along with these staggering numbers come problems, such as the following:

  • Latency: It’s hard to process massive amounts of data from many disparate sources in a timely manner.
  • Complexity: Networking solutions have become complex and require highly specialized skills.
  • Costs: Rising costs for people, time, and infrastructure make current solutions prohibitive.

On the bright side, there is a solution. The Talksum Data Stream Router™ (TDSR™) addresses each of these problems:

  • Speed: The TDSR operates in real time and keeps pace with today’s velocity.
  • Simplicity: The TDSR simplifies the data management process and avoids the need for specialized skills and custom code.
  • Value: The TDSR offers high efficiency and low overhead in terms of people, time, and infrastructure.

Talksum IT and Data Center solutions provide the following:

  • Network and monitoring optimization.
  • Scalability and modularity.
  • Unified services across platforms and applications.
  • Disaster recovery optimization and reusability.
  • A way to capitalize on existing assets.
  • A green, eco-friendly solution to data management.

In addition, the TDSR can be used for any vertical application that requires massive amounts of processing at the data center, including media, energy, oil and gas, weather, retail, financial, government, and transportation, to name a few.

For a full-sized view of the Talksum infographic, click here.