Intelligent Transportation Ecosystem

IoT, ITS, and the Talksum Ecosystem

We want self-driving smart cars!

Dale Russell, CTO, Talksum, San Jose

 

We’ve been hearing a lot about the Internet of Things (IoT) for the last decade: IoT is coming, IoT is here, IoT and Big Data…

 

Many of the people talking about IoT are from the computer industry. They’re selling us on a new market valued at around $1.7 trillion by 2020, yet when convincing us to invest in the IoT sector they point to repackaged computer or data center solutions. You can buy several brands of set-top media players listed as IoT devices; on closer inspection they are small computers with an HDMI connector and a wireless network card connecting to a web service. Smart watches are little more than Bluetooth accessories for your smart phone. Have we really gone from Internet hosting to web services to cloud services and now to IoT simply by releasing smaller computers and distributed web services?

 

That’s beginning to change, and consumers will be driving the market, so to speak. We want self-driving smart cars! Automobile manufacturers old and new are working on self-driving modules, most governments want Intelligent Transportation Systems (ITS), and we as consumers want gadgets!

 

What gadget is cooler than self-driving cars?

 

Consumers also want to bring infotainment and entertainment, social groups, smart devices, personal data contracts, and other services along for the ride. Put all of these wants together and we have the beginning of the Digital Transportation Era™. We want self-driving cars, traffic re-routing for first responders, reduced traffic congestion, quieter roads, and cleaner air. With all of these goals, the automobile becomes the consumer’s mobile data center, carrying all of the security concerns of an internet data center and many more. Automobile manufacturers, government and regulatory bodies, insurance and telematics services, and internet media providers all bring their own protocols into this new mobile data center. The new smart car will need to be considered part of the larger ITS and IoT clouds.

Intelligent Transportation Ecosystem

To meet the challenges of the Digital Transportation Era™, the vehicle needs to be an integral part of these new ecosystems. Intelligent Transportation Systems will include new networks, data management, and new standards and protocols such as Vehicle-to-Vehicle (V2V), Vehicle-to-Infrastructure (V2I), Signal Phase and Timing (SPaT), and many yet to be legislated. These signals must be introduced in a manner that ensures validity and security and protects these public networks against DDoS attacks. The introduction of internet services, fleet management, insurance, and telematics into this mobile data cloud also requires firewalls and gateways to ensure that the automobile is securely protected.
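
To make that gatekeeping step concrete, here is a minimal sketch, in Python, of the kind of check a vehicle gateway might apply to an inbound roadside broadcast (a SPaT or V2I message, say) before admitting it to the in-vehicle network. Production V2X systems rely on certificate-based message signing; the shared-key HMAC, field names, and freshness window below are illustrative assumptions, not a real protocol.

    import hashlib
    import hmac
    import json
    import time

    SHARED_KEY = b"example-roadside-unit-key"   # assumption: a pre-provisioned key
    MAX_AGE_SECONDS = 2.0                        # stale broadcasts are dropped

    def admit(raw_message, signature_hex):
        """Return the parsed message if it is fresh and authentic, otherwise None."""
        expected = hmac.new(SHARED_KEY, raw_message, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, signature_hex):
            return None                          # reject: signature does not verify
        message = json.loads(raw_message)
        if time.time() - message.get("timestamp", 0) > MAX_AGE_SECONDS:
            return None                          # reject: too old, possible replay
        return message                           # safe to route onto the vehicle network

The point is simply that every message is checked for authenticity and freshness before it is allowed to influence anything on the vehicle side.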

 

Automobiles already generate large streams of data from in-vehicle buses, measured in gigabytes per hour. Additional radios, GPS, mobile carriers, infotainment systems, and self-driving systems will make data management and security a must for this new mobile cloud platform.
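
As a small illustration of what those streams look like, the sketch below decodes one raw CAN frame into a normalized, timestamped event. The arbitration ID, byte layout, and scale factor are invented for the example; real signal definitions come from a vehicle’s DBC database rather than from code like this.

    import struct
    import time

    def decode_frame(arbitration_id, payload):
        """Map one raw CAN frame to a normalized event (or None if the frame is unknown)."""
        if arbitration_id == 0x0C1 and len(payload) >= 2:        # assumed "vehicle speed" frame
            raw_speed = struct.unpack_from(">H", payload, 0)[0]  # big-endian 16-bit value
            return {
                "signal": "vehicle_speed_kph",
                "value": raw_speed * 0.01,                       # assumed scale factor
                "timestamp": time.time(),
                "source": "CAN",
            }
        return None                                              # unknown frame: skip it here

    event = decode_frame(0x0C1, bytes([0x27, 0x10, 0, 0, 0, 0, 0, 0]))
    # event -> {'signal': 'vehicle_speed_kph', 'value': 100.0, ...}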

 

Some of the Challenges:

 

  • Cybersecurity Protection from DDoS, Man in the Middle, and other Internet Attacks
  • Real-time Validation and Insights from these Data Streams
  • Normalization of Standards and Protocols (CAN, LIN, I2C, WAVE, OpenXC, ASN.1…), sketched in code after this list
  • Extensible Design for Future Standards and Protocols (NHTSA)
  • Regulation Compliance
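
Of these, protocol normalization is the easiest to show in miniature. The sketch below maps an OpenXC-style JSON reading and an already-decoded CAN signal onto one assumed event schema so that downstream logic only ever sees a single shape; the schema fields are illustrative and are not a Talksum format.

    import json
    import time

    def from_openxc(raw_json):
        """OpenXC translators emit simple name/value JSON messages."""
        msg = json.loads(raw_json)
        return {"signal": msg["name"], "value": msg["value"],
                "timestamp": msg.get("timestamp", time.time()), "source": "OpenXC"}

    def from_can(signal_name, value):
        """A CAN signal that has already been decoded elsewhere."""
        return {"signal": signal_name, "value": value,
                "timestamp": time.time(), "source": "CAN"}

    events = [
        from_openxc('{"name": "engine_speed", "value": 1750}'),
        from_can("vehicle_speed_kph", 63.5),
    ]
    # Both entries now share one schema, whatever bus or protocol they came from.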

 

At Talksum we deliver a real-time, scalable ecosystem that addresses these concerns. Talksum is a Data Routing Engine deliverable as a Data Center solution, as a Virtual Router for cloud deployment, and as an embedded router on select ARM, AArch64, and customer boards. The Talksum product family provides a scalable and holistic ecosystem from the Dashboard to the Data Center.

 

We will be presenting at several conferences this spring and summer, where we will discuss the Talksum Ecosystem. Until then, follow this blog; next we will discuss how the networks and ecosystem requirements above overlap.

A Talksum Perspective – Understanding Data Early in the Process

Barry Strauss, Head of Marketing, Talksum

With the amount of data increasing at exponential rates, you would think that businesses today would have it made: awash in information, able to apply it to problems on the spot, react instantaneously, and be more proactive than ever. Unfortunately, this is most often not the case.

The inordinately massive amount of available information has ironically put a strain on the enterprise, more specifically on the network infrastructure. Big Data that gets bottlenecked and can’t be served properly becomes responsible for downtime, operational inefficiencies, potential disasters, policy violations, lost opportunities, higher costs, security violations, and other problems.

Ultimately, the ineffective infrastructure triggers financial losses because real-time decisions cannot be made with the data.

The traditional approach to fixing this problem is to store data first and then to make something out of it. The focus of innovation has been to make storage bigger and faster (improving traditional databases and building new storage solutions such as Hadoop, in-memory databases, and others) and to build analytics on top of it. This is complex and expensive to implement and also raises scalability and stability concerns.

Talksum has tackled this problem with a more efficient approach: understanding data and then acting upon it in real time, before storing it. This allows enterprises to apply business logic early in the process, before data is stored, and to decide what needs to be acted upon in real time and what needs to be routed to the appropriate downstream destinations.
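
As a rough sketch of what applying business logic early can look like, the snippet below classifies each incoming event against simple rules: urgent events trigger an action immediately, and everything else is routed to an appropriate downstream destination. The rules and destination names are hypothetical and do not represent the TDSR’s actual configuration interface.

    def alert_operations(event):
        print("ALERT:", event)                   # stand-in for a real notification hook

    def route(event):
        """Decide, per event, what to do right now and where the event should go."""
        if event.get("severity") == "critical":
            alert_operations(event)              # act in real time
            return "alerts"
        if event.get("type") == "metric":
            return "timeseries_store"            # route to downstream analytics
        return "cold_storage"                    # everything else is archived

    for event in ({"type": "metric", "cpu": 0.72},
                  {"type": "login", "severity": "critical", "user": "unknown"}):
        print(route(event))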

Again, the key point of the Talksum approach is to understand the data before it is stored, as opposed to today’s approach of storing first and then trying to figure out what is in the data.

At the end of the day, it all comes down to saving $$$$ and eliminating financial losses. The underlying, innovative approach built into the Talksum product allows a single Talksum Data Stream Router (TDSR) to replace racks of servers in the data center. This drastically reduces the infrastructure footprint, staffing and support needs, energy consumption, the number of required software licenses, and other costs. At the same time, the TDSR reduces the number of steps and the amount of time (from days or weeks to seconds and minutes) needed to make timely decisions, create reports, build charts, and take appropriate actions.

The Talksum solution also helps data centers solve one of their biggest challenges, dealing with multiple logging formats and data schemas, by providing a universal logging profiler, and it addresses other systems interoperability problems as well.
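
The logging-format problem is easy to picture with a small example. The sketch below reduces a syslog-style line and a JSON application log to one common record shape; the regular expression and field names are assumptions for illustration, not the actual profiler.

    import json
    import re

    SYSLOG_RE = re.compile(r"^(?P<ts>\w{3} +\d+ [\d:]+) (?P<host>\S+) (?P<msg>.*)$")

    def normalize(line):
        """Reduce a syslog-style line or a JSON log line to one common record shape."""
        if line.lstrip().startswith("{"):                        # JSON-formatted log
            record = json.loads(line)
            return {"host": record.get("host", "unknown"),
                    "timestamp": record.get("time"),
                    "message": record.get("msg", "")}
        match = SYSLOG_RE.match(line)                            # classic syslog text
        if match:
            return {"host": match.group("host"),
                    "timestamp": match.group("ts"),
                    "message": match.group("msg")}
        return {"host": "unknown", "timestamp": None, "message": line}

    print(normalize("Mar  3 10:15:01 web01 sshd[991]: Failed password for root"))
    print(normalize('{"time": "2015-03-03T10:15:01Z", "host": "web01", "msg": "Failed password"}'))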

The TDSR performs all of this while enforcing security and policy compliance.

To sum it up, the TDSR replaces the many separate solutions these challenges would otherwise require with a single solution that covers them all. And it does this in real time at network speeds.

 

Click here for more information about the Talksum product and its features.

Data Science as a Solution

Dale Russell, CTO, Talksum

As our understanding of data science problems evolves, we find that effective solutions apply a systematic approach to testing, measuring, and building knowledge of the whole data system. In order to effectively and efficiently create this holistic view of data, first consider the entirety of the data landscape from Infrastructure to Layer 7. A comprehensive data science solution should not have biased access to data from any one layer more than another. When architecting a solution, keep in mind that business requirements will change, message types and objects will change, and the volume of data from various OSI layers will change, especially as the Internet of Things (IoT) becomes more of a reality.

To best deal with an ever-changing data landscape, follow this important principle: Never leave work for a downstream process. Datasets will continue to grow in volume and diversity, and solutions will be expected to take less time to process data or make it actionable. Store-and-sort is a costly strategy regardless of who owns the infrastructure. We found the best approach is to sort first, then store.

Over the last 15 years, exceptional and innovative storage solutions have been developed using distributed software, socket libraries, and advanced cloud services. These bring substantial performance increases, benefiting data center environments where latency, growing storage, or increased demand for analytics on datasets are concerns. As innovations in this sector bring more data into your landscape, you can enable great data science by taking a broader approach.

While some solutions focus on a subset of problems, a great data science solution deals with the entirety of information across the data landscape. In working with our customers and partners, we found that any acceptable solution must not only accommodate changing data requirements but do so in a manner that maintains the highest level of data fidelity. If new analytical processes are created, the solution should easily direct the correct data streams to the new processes without a lot of work for your team.

A proper data science solution empowers the organization to focus on asking forward-looking questions of their data, not requiring them to constantly invest time searching for new data solutions every time the data landscape changes (as it will continue to do).

 

 

Talksum Overview, or How the TDSR Works Like a Brain

Barry Strauss, Head of Marketing, Talksum

The Talksum Data Stream Router™ (TDSR™) offers a solution for Big Data initiatives that aim to deal with large amounts of disparate data types in real time. The TDSR works like a brain: it ingests massive amounts of information, then filters and contextually routes it to where it is needed, at the time it is needed.

Let’s take a look at how the brain works. The brain handles about 400 billion bits of information each second and would fill its capacity, in theory, in about three hours. To handle this, the brain selects only about 2,000 events to use.

Data centers face a similar problem: they need a way to handle massive amounts of Big Data. Today everything is stored first, and then we try to make sense of it. The focus of innovation has been to make storage larger and faster, but with the amount of data growing exponentially, new approaches are necessary.

That’s where the brain and the TDSR come into play. Just like the brain, the Talksum Data Stream Router filters incoming data and optimizes the data management process, making it easy to monitor, analyze, and contextually route information in real time.

Click here for more information about the Talksum product and its features.

How the TDSR Can Help the Feds Save up to $32.5 Billion Annually

Barry Strauss, Head of Marketing, Talksum

In a new MeriTalk report entitled “The Drive to Thrive: Ensuring the Agile Data Center,” Federal field workers and data center leads stressed the need for instant information access to do their jobs, while at the same time pointing out the negative effects of downtime on both productivity and the bottom line.

The report stated that real-time access could save the federal government $32.5 billion annually. Federal field workers noted that real-time information access saves them an average of 17 hours per week, or 816 hours per year; multiplied by the number of field workers in the U.S. government, this translates into about $32.5 billion in annual productivity savings.

Respondents noted, however, that 70 percent of agencies had experienced downtime of 30 minutes or more in the last month. According to the report, 90 percent of respondents said downtime affects their ability to do their job, and 42 percent said they couldn’t support their agency’s core mission. In fact, less than one-fifth of Federal IT professionals are fully confident in their agency’s ability to meet up-time and fail-over requirements.

In addition, 80 percent of Federal IT professionals cite data center reliability as a top priority for their agency.

The real-time Talksum Data Stream Router, or TDSR, has been built with this in mind. Talksum takes a highly efficient approach to data processing, management, and analytics for secure data center solutions. The TDSR improves data acquisition and transformation, converts data into flexibly managed event streams, provides actionable data, and reduces reporting latency for critical events to seconds – perfect for any data center and data center infrastructure management (DCIM) system.

The TDSR processes, manages, and contextually routes millions of events per second to power real-time monitoring and alerts, preventing downtime while also freeing up storage through its data reduction technology. The TDSR can instantly send alerts for memory errors, queries, non-normal activities, SLA blips, electrical capacity, unauthorized login attempts, system statistics, and a host of other conditions, custom-tailored per agency requirements.
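
As one illustration of such an alert, the sketch below flags repeated failed logins from the same host within a short window. The threshold, window, and event fields are assumed for the example rather than drawn from any agency policy.

    import time
    from collections import deque

    WINDOW_SECONDS = 60
    THRESHOLD = 5
    recent_failures = {}                          # host -> deque of recent failure times

    def should_alert(event):
        """Return True when a host accumulates too many failed logins in the window."""
        if event.get("type") != "login_failure":
            return False
        host = event.get("host", "unknown")
        now = event.get("timestamp", time.time())
        window = recent_failures.setdefault(host, deque())
        window.append(now)
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()                      # drop attempts outside the window
        return len(window) >= THRESHOLD           # alert once the burst is large enough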

The hardware-based, highly configurable TDSR can be easily deployed for highly specialized solutions without the need for specialized coding, and it includes the foundational components for regulatory compliance, government standards, and policy control.

The Talksum solution offers security controls that span firewalls, intrusion detection systems, anti-virus and anti-malware systems, network devices, server hosts, applications, physical systems, and more to eliminate the problems before they happen.

More information about Talksum solutions.