New Mexico Consortium and Los Alamos Labs Big Data Management Workshop

Alex Varshavsky, CEO, Talksum

Last week, we had the pleasure of leading a workshop for the New Mexico Consortium (NMC) and the Los Alamos National Laboratory (LANL) about Big Data management at the data center level.

At the workshop, we discussed how Big Data initiatives need more than just new storage platforms and BI solutions; they also need new approaches to data architectures and management strategies. Those architectures must be:

  • Real time, to keep pace with data velocity.
  • Adaptive, to meet changing requirements.
  • Simple to use, to avoid the need for specialized skills and custom code.
  • Low overhead in terms of people, time, and infrastructure.

Enterprises are overloaded with data. In fact, as highlighted in a Marcia Conner blog post, more than 2.5 quintillion bytes of data (a quintillion is 1 followed by 18 zeros) are created every day, with 90 percent of the world’s data created in the last two years alone. As a society, we now produce and capture more data each day than humanity saw in all the time that came before.

Most of this data filters through data centers, and along with the data come Big Data problems, such as:

  • How do you reasonably ingest, transform, route, and analyze data in real time?
  • How can you apply more logic earlier in the pipeline, while minimizing ingest performance impact?
  • How can you begin to create a holistic view of the information in the data so that you can correlate events from multiple domains?

These and other problems were covered with use cases, and we followed up with solutions to those use cases. We were excited to show how the Talksum Data Stream Router™ (TDSR™) handles the problems above, and we demonstrated examples using the attendees’ own data. The TDSR can ingest multiple transport and application protocols, as well as multiple formats, and convert the incoming data into parallel data streams, or event messages, that are aggregated, filtered, and then contextually routed to their respective end points (summarized data, aggregated data, dynamic streams, data stores, data warehouses, log storage, and so on).
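The TDSR itself is a proprietary product, but the general pattern described above (ingest mixed input formats, normalize them into event messages, then contextually route each event to an end point) can be sketched in a few lines of Python. All function names, field names, and endpoint names below are illustrative assumptions, not Talksum APIs:

```python
import json
from collections import defaultdict

# Sketch of a contextual event router: normalize mixed-format input
# into one canonical event shape, then route by event content.

def normalize(raw):
    """Convert JSON or key=value text into one canonical dict."""
    if raw.lstrip().startswith("{"):
        return json.loads(raw)
    return dict(pair.split("=", 1) for pair in raw.split())

def route(event):
    """Pick an endpoint name from the event's content."""
    if event.get("type") == "log":
        return "log_storage"
    if float(event.get("value", 0)) > 100:
        return "alert_stream"
    return "data_warehouse"

def process(raw_messages):
    """Group incoming raw messages by their destination endpoint."""
    endpoints = defaultdict(list)
    for raw in raw_messages:
        event = normalize(raw)
        endpoints[route(event)].append(event)
    return endpoints

out = process(['{"type": "log", "msg": "boot"}', "type=sensor value=212.5"])
```

A production router would of course stream events to live sinks rather than collect them in memory, but the shape of the logic (normalize, then route on content) is the same.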

It was a fruitful workshop and we hope to present at many more.

 

New Development Facility in Silicon Valley!

Alex Varshavsky, CEO, Talksum

Yesterday, we announced the opening of our new development facility in Campbell, California, to accommodate growth based upon our market forecast and the current demand for Big Data products such as the Talksum Data Stream Router (TDSR).

The new facility, which was originally the first home of Apple Computer, is more than four times the size of the last facility and will allow us to ramp up more easily as demand for our products grows. In addition, the Silicon Valley location is rich in technologists, developers, and vendors of all levels.

With that in mind, we are currently seeking top-notch software and systems engineers. Take a look at the current Talksum job openings and let us know if you qualify, or feel free to forward the link to your friends.

We are planning to use the larger facility to grow the engineering group and increase system administration functionality.


Driving Oil and Gas Exploration, Production Efficiency Through Harmonized Data

Dale Russell, CTO, Talksum

We just got back from the Big Data Analytics for Oil & Gas Forum in Houston, TX, where Talksum was asked to participate as both a speaker and the conference chair. I want to share insights and expand on the event’s theme: driving oil and gas exploration and production efficiency through harmonized data.

Speakers from Shell, Riss Energy, Halliburton Landmark Software & Services, Hess Corporation, and others reinforced the theme of harmonized data. Subjects ranged from cross-unit and cross-domain performance and data management to data accuracy, cloud computing, seismological processes, and case studies on how to harness the 2.7 zettabytes of data within the oil and gas digital universe.

The forum started with a keynote from Patrick McGinty, CIO of Riss Energy, who looked ahead at energy in the Big Data era. He delved into the strategies and tools that are changing the landscape of the energy industry, and he ended the session with an analysis of emerging trends in predictive analytics and cloud computing and an exploration of the future of the industry under new technologies.

Dwayne Williams, UNC Wells HSE – West Supervisor for Shell Oil Company, gave a case study that followed real-time data – from the field to corporate analytics. During the session, Dwayne showed how Shell improved field productivity and reduced risks using mobile devices and mobile form apps to improve health, safety, security, and environment (HSSE) data collection, and how they connected this data to the corporate analytics system.

Day 1 ended with my session on cross-domain data management for the oil and gas sector. In the session, I focused on the importance of ingesting different types of data from multiple data sets, including those from the well site, remote units, regional sites, headquarters, and the real-time operation center (RTOC), and of being able to format, or harmonize, the disparate data in real time, as well as enrich the data by correlating events with other external data sources. In a cross-domain scenario, the ingested data would “talk” to each other, and relevant parts would be intelligently routed to the appropriate real-time BI tools, dashboards, and respective databases at the well site, remote units, RTOC, and so on, for immediate action. In addition, pertinent parts of the data flow would be streamed to other cross domains, such as health services and insurance companies, to minimize latency.
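The two steps named in that session, harmonizing disparate records into one schema and enriching them from an external source, can be illustrated with a minimal sketch. The field mappings, the weather lookup, and every name here are hypothetical examples, not real well-site schemas:

```python
# Sketch of "harmonizing" two disparate feeds (a well-site unit and an
# RTOC feed, as an assumed example) into one schema, then enriching
# each record by correlating it with an external data source.

FIELD_MAP = {
    "wellsite": {"psi": "pressure", "ts": "timestamp"},
    "rtoc":     {"pressure_kpa": "pressure", "time": "timestamp"},
}

# Stand-in for an external source (e.g., weather keyed by well id).
WEATHER = {"well-7": "storm"}

def harmonize(source, record):
    """Rename source-specific fields to one canonical schema."""
    return {FIELD_MAP[source].get(k, k): v for k, v in record.items()}

def enrich(record):
    """Correlate the event with external data (here: weather by well)."""
    record["weather"] = WEATHER.get(record.get("well"), "unknown")
    return record

a = enrich(harmonize("wellsite", {"well": "well-7", "psi": 3200, "ts": 1}))
b = enrich(harmonize("rtoc", {"well": "well-9", "pressure_kpa": 22000, "time": 2}))
```

Once records share one schema, routing and cross-domain correlation become straightforward comparisons on common field names instead of per-source special cases.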

On day 2, an interesting session was given by Riss Energy CEO Srik Soogoor, who discussed how cloud computing and Big Data are revolutionizing exploration and drilling. He looked at strategies and tools that are changing the landscape of the energy industry, at analyses of emerging trends in predictive analytics, and at the future of the industry under new and proposed government regulations.

The Big Data Analytics for Gas & Oil Forum brought with it an informative two days of introductions, workshops, presentations, discussions, and insights into Big Data analytics for oil and gas – highlighted by the need for harmonized data to drive oil and gas exploration and production efficiency.

 

Talksum Gets Ready for the Big Data Oil & Gas Forum, Announces Solution for Oil and Gas Exploration and Production

Alex Varshavsky, CEO, Talksum

As I described in an earlier blog post, Talksum has been invited to participate at the Big Data Analytics for Oil & Gas Forum in Houston, TX, which will take place tomorrow, October 17-18, 2013. We will be speaking at the forum and sharing our knowledge about cross-domain data management, how it works in the oil and gas sector, and why it’s so important for the industry in terms of efficiency, security, and safety. Since my last blog post, a couple of important events have occurred.

First, the event organizers have asked us not only to speak about cross-domain data management and provide demonstrations in our booth, but also to chair the event. We enthusiastically said “yes.” To accommodate this, our CTO Dale Russell will also attend the forum with me. We will be talking about a new approach to working with data across oil and gas domains, and how to handle, process, and intelligently route the massive amounts of data in real time to the appropriate users at the well site, remote units, RTOC, and so on, for immediate action. We will also explain how pertinent parts of the data flow could, in parallel, be streamed to other cross domains such as health services and insurance companies.

Second, we have posted a press release today that complements our event participation and officially announces our cross-domain data management solution for oil and gas exploration and production. You can view the entire Talksum oil and gas solution press release here.

If you are in the Houston area, please let us know so that we can set up a face-to-face meeting with you during the event. If you cannot make it to this forum, fill out our contact form as appropriate and we will contact you as soon as possible.

 

The Importance of Cross-Domain Data Management for Oil and Gas Exploration and Production

Alex Varshavsky, CEO, Talksum

Next Thursday, October 17-18, 2013, Talksum will be participating at the Big Data Analytics for Oil & Gas Forum in Houston, TX. Top minds in oil and gas data analytics, exploration, production, and data management will focus on cross-division efficiency, data accuracy, data management, and other relevant and timely topics, and on how to bridge gaps and look closely at the sector’s biggest asset: data. I’m excited to be a speaker at the forum and to share our knowledge about cross-domain data management, how it works in the oil and gas sector, and why it’s so important for the industry.

The oil and gas exploration and production industry is huge and relies on many players for success. If well logs and sensor information aren’t processed in a timely manner and don’t reach the targeted audiences at the right time, billions of dollars could be lost and catastrophic results could occur.

To give an idea of the magnitude of the oil and gas sector, as well as the importance of cross-domain data management, we can look at a few numbers:

  • In the North Sea, a single 6 km well can cost $100 million to drill.
  • Non-productive time, which can be caused by a myriad of factors including bad weather, operator errors, leaks, pressure problems, faulty parts, and so on, can be as high as 30 percent.
  • With thousands of offshore rigs in the world and an average daily rental rate exceeding $150,000 per rig, a few percent cost reduction across the offshore fleet would translate into savings of billions of dollars per year.

These big numbers depend on massive amounts of data flowing among many domains, including exploration, extraction, navigation, distribution, communications, health services, and refinery control, to enable safe environments and efficient results. Cross-domain data management enables data to:

  • Cross barriers of proprietary software and systems.
  • “Talk” to each other in real time at the right locations when needed.
  • Detect “abnormal” events in real time before a catastrophic event occurs.
  • Comply with industry and government standards, as well as ever-changing regulations and policies.
  • Be safe, secure, and protected from cyber attacks.
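The third point in the list above, detecting “abnormal” events in real time, is worth a concrete illustration. A minimal version compares each new sensor reading against a rolling window of recent values; the window size and deviation threshold below are arbitrary assumptions for the sketch, not parameters of any real drilling system:

```python
from collections import deque

# Sketch of a streaming anomaly check: flag a reading that deviates
# sharply from a rolling window of the most recent values.

def make_detector(window=5, factor=2.0):
    recent = deque(maxlen=window)

    def check(value):
        """Return True if value is abnormal relative to recent history."""
        abnormal = False
        if len(recent) == recent.maxlen:
            mean = sum(recent) / len(recent)
            spread = (max(recent) - min(recent)) or 1.0
            abnormal = abs(value - mean) > factor * spread
        recent.append(value)
        return abnormal

    return check

check = make_detector()
# Steady pressure readings, then a sudden spike on the final value.
flags = [check(v) for v in [10, 11, 10, 12, 11, 10, 95]]
```

The value of doing this check in the stream, rather than in a downstream warehouse, is latency: the flag can be routed to the well site or RTOC while the event is still actionable.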

I’m looking forward to the event and will post interesting insights in a future blog. If you would like more details on the Talksum solution for oil and gas, please fill out our contact form and one of our team members will be happy to oblige.