Intelligent Transportation Ecosystem

IoT, ITS, and the Talksum Ecosystem

We want self-driving smart cars!

Dale Russell, CTO, Talksum, San Jose

 

We’ve been hearing a lot about the Internet of Things (IoT) for the last decade. IoT is coming, IoT is here, IoT and Big Data…

 

Many of the people talking about IoT come from the computer industry. They’re selling us on a new market valued at around $1.7 trillion by 2020. Yet while convincing us to invest in the IoT sector, they often point to a repackaged computer or data center solution. You can buy several brands of set-top media players listed as IoT devices; upon closer inspection, each is a small computer with an HDMI connector and a wireless network card connecting to a web service. Smart watches are little more than Bluetooth accessories for your smartphone. Have we really gone from internet hosting to web services to cloud services and now to IoT merely by releasing smaller computers and distributed web services?

 

That’s beginning to change, and consumers will, so to speak, be driving the market. We want self-driving smart cars! Automobile manufacturers old and new are working on self-driving modules, most governments want Intelligent Transportation Systems (ITS), and we as consumers want gadgets!

 

What gadget is cooler than a self-driving car?

 

Consumers also want to bring infotainment and entertainment, social groups, smart devices, personal data contracts, and other services along for the ride. Combine all of these wants and we have the beginning of the Digital Transportation Era™. We want self-driving cars, traffic re-routing for first responders, reduced traffic congestion, quieter roads, and cleaner air. With all of these wonderful goals, the automobile has become the consumer’s mobile data center, bringing all of the same security concerns as an internet data center and many more. Automobile manufacturers, government and regulatory bodies, insurance and telematics services, and internet media providers all bring their own protocols into this new mobile data center. The new smart car will need to be considered part of the larger ITS and IoT clouds.

Intelligent Transportation Ecosystem

To meet the challenges of the Digital Transportation Era™, the vehicle needs to be an integral part of these new ecosystems. Intelligent Transportation Systems will include new networks, data management, and new standards and protocols such as Vehicle to Vehicle (V2V), Vehicle to Infrastructure (V2I), Signal Phase and Timing (SPaT), and many yet to be legislated. These signals must be introduced in a manner that ensures validity and security, protecting these public networks against DDoS attacks. The introduction of internet services, fleet management, insurance, and telematics into this mobile data cloud also requires firewalls and gateways to ensure that the automobile is securely protected.
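
To make the validity requirement concrete, here is a minimal sketch, in Python, of how a receiver might vet an incoming SPaT-style message before acting on it. The shared key, field names, and five-second freshness window are illustrative assumptions; production V2X security uses certificate-based signatures under standards such as IEEE 1609.2, not a pre-shared secret.

    import hmac, hashlib, json, time

    # Illustrative pre-shared key; real deployments use certificate-based
    # signatures (e.g., IEEE 1609.2), not a shared secret.
    SHARED_KEY = b"demo-roadside-unit-key"
    MAX_AGE_SECONDS = 5.0  # reject stale messages to blunt replay attacks

    def verify_spat_message(raw: bytes, signature: str):
        """Accept a SPaT-like message only if its MAC matches and it is fresh."""
        expected = hmac.new(SHARED_KEY, raw, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, signature):
            return None  # forged or corrupted message: drop it
        msg = json.loads(raw)
        if time.time() - msg["timestamp"] > MAX_AGE_SECONDS:
            return None  # replayed or badly delayed message: drop it
        return msg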

 

Automobiles already generate large streams of data from in-vehicle buses, measured in gigabytes an hour. The additional radios, GPS, mobile carriers, infotainment systems, and self-driving systems will make data management and security for this new mobile cloud platform a must.
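
For a sense of where those gigabytes come from, the sketch below taps a single in-vehicle bus with the open-source python-can library. The interface name ("can0") assumes a Linux SocketCAN setup; a real vehicle multiplies this traffic across several buses plus radios and sensors.

    import can  # open-source python-can library

    # Assumes a Linux SocketCAN interface named "can0".
    bus = can.interface.Bus(channel="can0", interface="socketcan")

    payload_bytes = 0
    for _ in range(10_000):
        frame = bus.recv()                # blocks until the next frame arrives
        payload_bytes += len(frame.data)  # classic CAN carries up to 8 bytes
    print(f"{payload_bytes} payload bytes in 10,000 frames on one bus")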

 

Some of the Challenges:

 

  • Cybersecurity Protection from DDoS, Man in the Middle, and other Internet Attacks
  • Real-time Validation and Insights from these Data Streams
  • Normalization of Standards and Protocols (CAN, LIN, I2C, WAVE, OpenXC, ASN.1…) (see the sketch after this list)
  • Extensible Design for Future Standards and Protocols (NHTSA)
  • Regulation Compliance
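
As an illustration of the normalization challenge above, this sketch maps two of the listed protocols onto one common event schema. The schema fields, the CAN ID, and the OpenXC signal are invented for the example; a real deployment would work from each protocol's published signal databases.

    import json, time

    def normalize_can(frame_id: int, data: bytes) -> dict:
        """Fold a raw CAN frame into the common event schema (IDs invented)."""
        return {"ts": time.time(), "source": "CAN",
                "signal": hex(frame_id), "payload": data.hex()}

    def normalize_openxc(raw: str) -> dict:
        """OpenXC messages are already JSON objects with a name and a value."""
        msg = json.loads(raw)
        return {"ts": time.time(), "source": "OpenXC",
                "signal": msg["name"], "payload": msg["value"]}

    # Two protocols, one schema a downstream router can act on uniformly.
    events = [
        normalize_can(0x0C4, bytes([0x12, 0x7A])),
        normalize_openxc('{"name": "vehicle_speed", "value": 72.5}'),
    ]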

 

At Talksum we deliver a real-time, scalable ecosystem that answers these concerns. Talksum is a Data Routing Engine that can be delivered as a data center solution, as a virtual router for cloud deployment, or embedded on select ARM, AArch64, and customer boards. The Talksum product family provides a scalable and holistic ecosystem from the dashboard to the data center.

 

We will be presenting at several conferences this spring and summer, where we will be discussing the Talksum Ecosystem. Until then, follow this blog: next we will discuss the overlaps among the above networks and their ecosystem requirements.

Internet of Things (IoT) to Reach $1.7 Trillion Market, Can the Data Center Handle It?

Barry Strauss, Head of Marketing, Talksum

In a Wall Street Journal article published this summer entitled “Internet of Things Market to Reach $1.7 Trillion by 2020,” contributor Steven Norton cited IDC as proclaiming that the IoT market will grow from $665.8 billion last year to $1.7 trillion in 2020 as “more devices come online and a bevy of platforms and services grow with them.” Although security will bring a new wave of concerns, a possibly greater concern is the number of data centers that are not prepared for the massive amounts of data coming from all of the “things” in the near future.

The “things” are connected devices that, according to IDC, are expected to grow from 10.3 billion last year to more than 29.5 billion in 2020. With numbers like those, the data moving among this multitude of disparate devices needs to be processed and managed at wire speed and under common standards.

“Enterprises have to manage that, so they have to create new management policies for the devices and how they’re connected,” said Vernon Turner, IDC’s research fellow for the Internet of Things. “There is a life cycle that has to happen that might be different from the traditional application life cycle,” he said. Interoperability will also be a major sticking point when it comes to corporate adoption.

IoT-enabled devices bring processing challenges, which can be broken down into three areas – data ingestion, data storage, and data analytics.

The first two areas represent the cost of doing business, while the third – analytics – is where the value of Big Data is seen. According to a Forbes article entitled “The Internet of Things Will Radically Change Your Big Data Strategy,” contributor Mike Kavis said, “Experts estimate that over half of all Big Data projects fail and most of those failures are due to projects never getting past the data ingestion phase.” In addition, Kavis stressed that even if an enterprise makes it past data ingestion, it would still have to learn new technologies such as Hadoop and MapReduce to provision enough disk, network, and compute capacity to keep up with the new incoming data. Finally, he mentioned that analytics would be difficult since the IoT data would be hard to integrate with existing data warehouse investments. And, he continued, “To make matters worse, the costs and effort to maintain and provision enough infrastructure to keep up with the incoming flow of data is an arduous task that continues to keep risks high throughout the life of the IoT investment.”

Worry no more.

The real-time Talksum Data Stream Router (TDSR) addresses the challenges in all three areas.

First, let’s take a look at data ingestion. Today, the TDSR ingests, normalizes, and integrates most types of data originating from multiple sources. The highly configurable rack-mounted units can handle structured, semi-structured, and unstructured disparate data arriving from any source over the network.

Second, the high-volume, high-performance TDSR keeps pace with all incoming data, processing millions of complex events per second as it transforms, filters, reduces, aggregates, enriches, analyzes, and contextually routes the “actionable” data to any type of downstream system for storage, business intelligence, and/or database use. Data can quickly move to where it is needed, in the format that is needed, at the time it is needed. And the TDSR is dynamically scalable to accommodate unpredictable data flow.
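
The TDSR’s internals are proprietary, but the filter-enrich-route pattern described above can be sketched in a few lines of Python. The event fields, the 90-unit threshold, and the sink names below are assumptions made up for the illustration.

    # Toy sinks standing in for downstream systems.
    SINKS = {"alerts": [], "warehouse": [], "archive": []}

    def route(event: dict) -> None:
        if event.get("value") is None:        # filter out malformed events
            return
        # enrich: tag each event with a severity before routing
        event["severity"] = "critical" if event["value"] > 90 else "normal"
        if event["severity"] == "critical":   # contextual routing
            SINKS["alerts"].append(event)     # low-latency action path
        else:
            SINKS["warehouse"].append(event)  # analytics / BI path
        SINKS["archive"].append(event)        # everything is retained

    for e in [{"sensor": "temp", "value": 97}, {"sensor": "temp", "value": 42}]:
        route(e)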

For the analytics area, the TDSR processes the data first as it comes in, then routes it to the appropriate analytics system for taking action. In other words, the TDSR transforms incoming disparate data from the various sources, allowing the data streams to “talk” to each other, then routes the data to the respective downstream analytics tools for real-time action when needed, reducing reporting latency for critical events to seconds.

Costs are kept to a minimum since the TDSR is highly configurable, so tailored solutions can be deployed without specialized coding. And the units include the foundational components for regulatory compliance, government standards, and policy control.

The Talksum data center solution streamlines service delivery and boosts overall performance – all with no impact on current infrastructure.

Click here for more information about Talksum solutions.


Solving Two of the Biggest Challenges of IoT: Data Lag and the Interoperability of Things

Barry Strauss, Head of Marketing, Talksum

Processing large volumes of disparate data coming in at high speeds (aka Big Data) not only requires vast computing resources, it also takes a long time. The delay between when data is received and when it turns into actionable insight causes financial losses, which can include costly infrastructure, downtime, operational inefficiencies, disaster recovery costs, policy violations, security violations, costs of standardization, and lost opportunities, among others.

What is the source of the problem? Existing solutions first store the data, then make sense of it when needed. As a result, 80 percent of a typical analytics project goes to preparing the data for analysis, leaving only 20 percent for actually performing the analysis. Data preparation includes tasks such as indexing, mapping, data reduction, organizing, and cleaning.

[Figure: from data to information to decision]

Traditional approaches try to solve this by making storage bigger and faster (improving traditional databases and building new storage solutions such as Hadoop, in-memory, and others), and building better analytics on top of it. But this comes with complexity and implementation expenses, as well as scalability and stability concerns.

Enter the Talksum solution, which takes the approach of first understanding the data and acting upon it in real time before storing it. The hardware-based solution allows enterprises to apply business logic early in the process, before data is stored, optimizing what needs to be acted upon in real time and what needs to be routed to respective downstream sources.

This approach also allows enterprises to efficiently manage, distribute, and track real-time Internet of Things (IoT) data. The Talksum solution provides the “Interoperability of Things” for the Internet of Things. Talksum products – the Talksum Data Stream Router™ (TDSR™) and the Embedded TDSR (eTDSR™) – connect data sources coming from multiple systems in different schemas, protocols, and formats. The solution can also extract, transfer, and asynchronously load data to different storage systems, providing interoperability to the systems that would otherwise live in silos.

[Figure: the TDSR/eTDSR IoT solution]

 

The TDSR/eTDSR solution works at both remote sites (IoT, Smart Cities) and at the data center. Each can operate independently of the other, or they can work together.

At remote locations, the eTDSR collects any type of sensor data, filters it, determines critical events, and contextually routes the relevant data to the appropriate services, including the data center.
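
A minimal sketch of that edge-side behavior is below, with an invented sensor threshold and an uplink callback: only readings judged critical leave the site, which is the bandwidth-saving point of filtering at the edge.

    import time

    CRITICAL_THRESHOLD = 8.0  # hypothetical limit; tuned per deployment

    def edge_loop(read_sensor, send_to_datacenter, samples: int = 100) -> None:
        """Forward only critical readings; routine ones stay local."""
        for _ in range(samples):
            reading = read_sensor()
            if reading >= CRITICAL_THRESHOLD:
                send_to_datacenter({"ts": time.time(), "value": reading})
            time.sleep(1.0)  # sampling interval, also illustrative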

At the data center, the TDSR retrieves data from the eTDSR and other sources for consumption by respective applications and data storage systems. It forwards relevant information back to the remote sites, to data centers, and to multiple domains, as applicable.

If you are interested in more information about the Talksum IoT solution and how it can help your organization, fill out our contact form and we will get back to you as soon as possible.

 

Click here for more information about Talksum products.

 

 

A Talksum Perspective – Making Big Data Implementations Successful

Alex Varshavsky, CEO, Talksum

In a recent Capgemini Consulting report entitled Cracking the Data Conundrum: How Successful Companies Make Big Data Operational, it was noted that global organizations spent over $31 billion on Big Data in 2013, and spending is predicted to reach $114 billion in 2018. Nearly 60 percent of the executives surveyed for the report believe that Big Data will disrupt their industry in the next three years.

The report goes on to state, however, that even though these global enterprises embrace Big Data, only 13 percent have achieved full-scale production of their Big Data initiatives. This raises the question – what is keeping them from full-scale production and operational success?

The report revealed these challenges:

  • Scattered silos of data.
  • Ineffective coordination of analytics initiatives.
  • Lack of strong data management and governance mechanisms.
  • Dependence on legacy systems to process and analyze the data.

We’ll look at each of these challenges and how the Talksum data management solution can help overcome them.

 

Scattered Silos of Data

The report noted that 79 percent of those surveyed had not fully integrated their data sources across the organization. This lack of integration created barriers to a unified view of data that prevented decision-makers from making accurate and timely decisions. In the report, Filippo Passerini, CIO of US-based consumer products leader P&G, said “To move the business to a forward-looking view, we realized we needed one version of the truth. In the past, decision-makers spent time determining sources of the data or who had the most accurate data. This led to a lot of debate before real decisions could be made.”

The Talksum solution handles massive amounts of disparate data originating from multiple sources. The single rack-unit device ingests the data, sorts and filters it, and makes it useful for a holistic view of the data resources. This allows enterprises to apply business logic early in the process, before the data is even stored, and optimizes what needs to be acted upon in real time and what needs to be routed to respective downstream sources. The Talksum Data Stream Router (TDSR) gives all applicable entities a holistic view of the information and breaks down the silos that create barriers to knowledge, insight, and action.

 

Ineffective Coordination of Analytics Initiatives

The Capgemini report noted, “A major stumbling block is a lack of adequate coordination among analytics teams.” Scattered resources and decentralized teams break down best practices that can be shared among the groups. As Eric Spiegel, CEO of Siemens USA, puts it, “Leveraging Big Data often means working across functions like IT, engineering, finance, and procurement, and the ownership of data is fragmented across the organization.”

Different teams often have different analytics systems and BI tools. After the TDSR makes sense of the incoming information, it reduces, enriches, analyzes, and contextually routes the data – in real time – to where it is supposed to go. Engineering receives the information relevant to it; finance receives the information relevant to it; and so on. In addition, all of the data can be stored for archival purposes in case it is needed later. If multiple organizations need the same data, it is delivered to each of them. Organizations receive data on a need-to-know basis.
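
One way to picture this need-to-know routing is as a simple table from event types to the teams entitled to them. The event types and team names here are hypothetical; they only illustrate the fan-out behavior described above.

    # Hypothetical routing table: event type -> teams that need it.
    ROUTES = {
        "purchase_order": ["finance", "procurement"],
        "sensor_fault":   ["engineering"],
        "login_audit":    ["it"],
    }

    def deliver(event: dict, outboxes: dict) -> None:
        """Copy an event to every matching team; the archive keeps everything."""
        for team in ROUTES.get(event["type"], []):
            outboxes.setdefault(team, []).append(event)
        outboxes.setdefault("archive", []).append(event)

    boxes: dict = {}
    deliver({"type": "purchase_order", "amount": 1200}, boxes)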

 

Lack of Strong Data Management and Governance Mechanisms

Coordination and governance become significant when dealing with implementation challenges. Twenty-seven percent of those surveyed cited ineffective governance models for Big Data and analytics. And coordinating data at ingestion and output requires the right data management capabilities. Talksum is a leader in high-speed data processing and management solutions whose mission is to develop new ways of processing, routing, and managing Big Data. This is another way of saying that we are a data management company – first and foremost. A significant benefit of the Talksum data management solution is built-in governance. The TDSR includes the foundational components for governance, regulatory compliance, government standards, and policy control.

 

Dependence on Legacy Systems to Process and Analyze the Data

Of those who responded to the Capgemini survey, 31 percent said their organization was dependent on legacy systems for data processing and management. It’s hard for these organizations to switch to newer systems for fear of incompatibility, of losing data, and of losing time.

The inventors of the Talksum solution have taken this into consideration and built compatibility into the product. The TDSR works with both legacy and newer systems. The unit slides into a server-room rack without the need for extensive and complicated coding (it only requires some configuration settings), and it can output data to both legacy and newer systems, including SharePoint, MySQL, AWS, MongoDB, Hadoop, in-memory options, and others.
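
To illustrate "configuration instead of coding," here is what a declarative output mapping might look like. The format and connection strings are invented for this sketch; the TDSR's actual configuration syntax is not public.

    # Hypothetical output configuration: each entry names a downstream
    # system, legacy or new, that routed data should be written to.
    OUTPUTS = [
        {"name": "erp_archive", "kind": "mysql",   "dsn": "mysql://dw-host/erp"},
        {"name": "doc_store",   "kind": "mongodb", "uri": "mongodb://docs-host"},
        {"name": "cold_lake",   "kind": "hdfs",    "path": "/data/raw"},
    ]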

 

Talksum Offers More Than Successful Big Data Implementation

To sum it up, the Talksum solution does more than make Big Data implementations successful; it can also save enterprises up to 80 percent of what it costs to build and run a data center by providing the following:

  • Operational efficiency.
  • Systems interoperability.
  • Infrastructure footprint reduction.
  • Policy compliance.
  • Added security.

To learn more about these, contact us and we’ll send you more information.

Click here for more information about the Talksum product and its features.

 

Data Science as a Solution

Dale Russell, CTO, Talksum

As our understanding of data science problems evolves, we find that effective solutions apply a systematic approach to testing, measuring, and building knowledge of the whole data system. To effectively and efficiently create this holistic view of data, first consider the entirety of the data landscape, from the infrastructure layer up to Layer 7. A comprehensive data science solution should not favor data from any one layer over another. When architecting a solution, keep in mind that business requirements will change, message types and objects will change, and the volume of data from the various OSI layers will change, especially as the Internet of Things (IoT) becomes more of a reality.

To best deal with an ever-changing data landscape, follow one important principle: never leave work for a downstream process. Datasets will continue to grow in volume and diversity, and solutions will be expected to take less time to process data or make it actionable. Store-and-sort is a costly strategy regardless of who owns the infrastructure. We have found the best approach is to sort first, then store.
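
A toy contrast makes the principle concrete. Both variants below see the same two events; only the sort-first version leaves downstream consumers with nothing left to do. The event kinds are invented for the illustration.

    raw_events = [{"kind": "error",  "msg": "brake fault"},
                  {"kind": "metric", "msg": "speed=72"}]

    # Store-and-sort: everything lands in one pile, and every later
    # query pays the cost of sifting through it.
    lake = list(raw_events)

    # Sort first, then store: classify once at ingest so each consumer
    # reads only the stream it cares about.
    stores = {"errors": [], "metrics": []}
    for e in raw_events:
        stores["errors" if e["kind"] == "error" else "metrics"].append(e)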

Over the last 15 years, exceptional and innovative storage solutions have been developed using distributed software, socket libraries, and advanced cloud services. These bring substantial performance increases, benefiting data center environments where latency, growing storage, or increased demand for analytics on datasets are concerns. As innovation in this sector brings more data into your landscape, you can enable great data science by taking a broader approach.

While some solutions focus on a subset of problems, a great data science solution deals with the entirety of information across the data landscape. In working with our customers and partners, we found that any acceptable solution must not only accommodate changing data requirements, it must do so in a manner that maintains the highest level of data fidelity. If new analytical processes are created, the solution should easily direct the correct data streams to new processes without a lot of work for your team.

A proper data science solution empowers the organization to focus on asking forward-looking questions of their data, not requiring them to constantly invest time searching for new data solutions every time the data landscape changes (as it will continue to do).