In The Boardroom With...

Ms. Diana Zavala
Director, Analytics and Data Management
HP Enterprise Services, U.S. Public Sector

SecuritySolutionsWatch.com: Thank you for joining us today, Diana. Before discussing Big Data trends, challenges and opportunities, can you please take a moment to tell us a little about your background?

Diana Zavala: I've been in the world of technology for 30 years and currently lead the HP Analytics and Data Management Practice for U.S. Public Sector. We define, plan and deliver unique information solutions to support the mission/business objectives of U.S. government clients by providing actionable information to optimize their enterprise. My background includes consulting and extensive experience delivering business technology enablement solutions - from strategy development through implementation and management, in both commercial and public sector industries.

SecuritySolutionsWatch.com: We read with great interest in HP's business white paper "Data Driven Government" that "Big Data, as a concept, is nothing new to government. Over the last two decades, as more systems of record (such as financials and human resource management, document and records management, and the like) are computerized, and more front-end systems of engagement (such as citizen-services portals and single, non-emergency number services and tax filings) are moving online, global public sector organizations are increasingly dealing with exponential growth in data that's essential to the services they provide. But by and large they are not fully taking advantage of the potential value this data offers for making government more efficient, smarter and focused on best serving the public." Can you please share your opinion on this matter?

Diana Zavala: You've mentioned the explosion of information, and we know, in our personal lives and as citizens and professionals within the IT industry, that information will continue to multiply. So, too, is there a growing expectation that government and citizens will be able to take advantage of the wealth of information now available. Today, citizens demand the same responsiveness and convenience from their government as they do from private industry.

While there is still much more to be done in terms of integrating data across ecosystems, there are a number of forces driving governments and citizens to better exploit information. The objectives of Open Data/Open Government, citizen-centric services, and common services platforms for data virtualization and integration are elevating the importance of using our data-rich environment to improve outcomes for the public good.

For example, citizens and businesses are often asked to submit the same information multiple times to different government agencies. This results in significant duplication of effort, as well as potential errors, which can cause agencies to deny benefits to those entitled to them. To help address this issue, HP designed and built a state-of-the-art platform that enables once-only data collection: citizens are never asked for the same information twice, thereby minimizing the effort required to interact with these agencies. One resulting benefit lies in the college study grants application process. Instead of filling out multiple paper forms and locating specific information from multiple government agencies, students can now complete the process online, just once.

Furthermore, information has traditionally been siloed. Point solutions solve difficult problems; however, connected as-a-service platforms are needed to achieve the level of service that citizens expect in a cost-effective manner. Optimizing information is a journey, and as data continues to multiply, new tools will emerge, along with a greater expectation to create actionable outcomes that improve efficiencies and help government personnel make more informed business decisions.

SecuritySolutionsWatch.com: Today, we are witnessing a data explosion to be sure, and Big Data appears to offer the potential of delivering the big picture, but only if an enterprise has access to 100 percent of its structured and unstructured data, and only if that data can be analyzed for deeper meaning and insight, and then acted upon. What are your thoughts on this?

Diana Zavala: Today, information is complex, available in real time and mostly unstructured. An enterprise must have the ability to analyze unstructured and structured information in order to gain deeper insights. Modernizing traditional business intelligence platforms by adding the ability to process unstructured or "human" information - social media, email, voice, video, documents, archives, etc. - provides an added layer of intelligence. In addition, an enterprise needs to consider its ability to embed analytics into business processes to streamline actions. This might entail pairing social media information with identity and access information to form conceptual analytics around possible insider threats, or to inform decisions on maintaining security clearances.
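As a purely illustrative sketch of what that kind of pairing could look like, the snippet below joins hypothetical identity-and-access events with hypothetical social media flags to surface users who may warrant a closer human review. The field names, sample data and threshold are invented for this example and do not represent any HP product interface.

    from collections import defaultdict

    # Hypothetical, simplified records: real feeds would come from an
    # identity and access management system and a social media monitoring tool.
    access_events = [
        {"user": "jdoe", "resource": "finance-db", "after_hours": True},
        {"user": "jdoe", "resource": "hr-share", "after_hours": True},
        {"user": "asmith", "resource": "wiki", "after_hours": False},
    ]
    social_signals = [
        {"user": "jdoe", "flagged": True},    # e.g. a post matched a risk keyword
        {"user": "asmith", "flagged": False},
    ]

    def insider_risk_candidates(events, signals, min_after_hours=2):
        """Pair access activity with social media flags to surface users
        worth human review. The threshold is illustrative only."""
        after_hours = defaultdict(int)
        for event in events:
            if event["after_hours"]:
                after_hours[event["user"]] += 1
        flagged_users = {s["user"] for s in signals if s["flagged"]}
        return [user for user, count in after_hours.items()
                if count >= min_after_hours and user in flagged_users]

    print(insider_risk_candidates(access_events, social_signals))  # ['jdoe']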

SecuritySolutionsWatch.com: Let's turn for a moment to HP's Big Data solution and strategic roadmap HAVEn, a platform comprising software, services and hardware that analyzes 100 percent of the data relevant to your organization. Please walk us through what HAVEn stands for, its components and what it does for an enterprise.

Diana Zavala: HAVEn is the acronym for our industry-leading Big Data platform: a compilation of the Hadoop, Autonomy, Vertica and Enterprise Security engines, which enable "n" (any number of) applications. HP HAVEn is an open, scalable solution that provides the ability to manage, transform and analyze the full spectrum of structured, semi-structured and unstructured data. The individual components consist of:
  • Hadoop - An open source, cost-effective capability to store massive amounts of data from virtually any source, Hadoop allows for the distributed processing of large data sets across clusters of computers using new programming models. It effectively allows the storage of any type of information and makes that data accessible at any time for analytics (a minimal sketch of this programming model appears after this list).

    All of the HP HAVEn engines, including Autonomy, Vertica and ArcSight, are able to interact with Hadoop for data collection and analysis. In addition to the engines, HP delivers an ecosystem with enterprise-strength features to help you harness Hadoop where appropriate.

    For example, Hadoop is well-suited for storing and cataloging large amounts of semi-structured data (such as logs) and unstructured data (such as audio, video and email). For other high-value data, HP customers typically rely on the real-time engines of HAVEn to store data in an optimal format for analysis - meaning that HAVEn processes data up to 100 times faster than the batch-oriented data processing of Hadoop.

    HP and Hortonworks recently announced a strategic partnership to address the critical Big Data needs of enterprise customers. The joint commitment will help accelerate the adoption of Enterprise Apache Hadoop by deeply integrating the Hortonworks Data Platform with the HP HAVEn Big Data platform and is supported by a $50 million equity investment by HP.

    As part of the partnership, HP and Hortonworks have committed to integrate their engineering strategies and deepen their existing go-to-market collaboration, enabling HP customers to deploy the Hortonworks Data Platform as the Hadoop component of HP HAVEn. HP will also work to certify HP Vertica with Apache Hadoop YARN, the architectural center of Hadoop 2.0.

  • Autonomy - HP Autonomy's Intelligent Data Operating Layer (IDOL) automates the process of recognizing, categorizing and retrieving concepts and meaning in unstructured human information, which falls into two categories:
      • Unstructured text data - which includes content in blogs, email, news feeds, documents and social media interactions.
      • Unstructured rich media - which includes photos, videos, sound files and forms of information that do not include text beyond simple metadata.


    IDOL forms a conceptual and contextual understanding of all content in an enterprise - automatically analyzing any piece of information from more than 1,000 different content formats. IDOL can perform more than 500 operations on digital content, and these functions are available to build rich analytic applications for the meaningful exploration of human data.

  • HP Vertica - This powerful, highly scalable analytics engine drives down the cost of capturing, storing and analyzing data, while producing answers faster than traditional data warehouse technology. Vertica provides an iterative, conversational approach to analytics, and its high-performance capability gives organizations faster insight into their data by running queries 50 to 1,000 times faster than legacy products. Vertica is massively scalable and is able to run on an unlimited number of industry-standard servers. Its open architecture allows enterprises to leverage current tools, as it supports Hadoop, R and a range of business intelligence tools. Vertica also optimizes data storage, using patented columnar compression to store 10 to 30 times more data per server than row-oriented databases.

  • ArcSight - The HP ArcSight Logger unifies searching, reporting, alerting and analysis across any type of enterprise log and machine data. It is unique in its ability to collect, analyze and store massive amounts of machine data generated by modern networks. ArcSight Logger collects data from any device, in any format, from more than 300 distinct log-generating sources, and allows the user to filter and parse it with rich metadata. With today's focus on security, continuous diagnostics and mitigation, ArcSight allows you to integrate and analyze machine data with other sources - something that is now critical to evaluating the overall risk and security posture of an enterprise.
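To make the Hadoop "new programming models" point concrete, here is a minimal word-count sketch in the Hadoop Streaming style, in which a mapper and a reducer simply read standard input and write standard output while the framework handles distribution, shuffling and sorting across the cluster. The script name and any paths are placeholders for illustration, not part of HAVEn itself.

    #!/usr/bin/env python3
    # Minimal MapReduce-style word count for Hadoop Streaming:
    # the mapper emits "word<TAB>1" pairs and the reducer sums the
    # counts for each word, relying on the framework's sort phase.
    import sys

    def mapper():
        # Emit a count of 1 for every word on every input line.
        for line in sys.stdin:
            for word in line.split():
                print(f"{word}\t1")

    def reducer():
        # Input arrives sorted by key, so counts for a word are contiguous.
        current, count = None, 0
        for line in sys.stdin:
            word, value = line.rstrip("\n").split("\t")
            if word != current:
                if current is not None:
                    print(f"{current}\t{count}")
                current, count = word, 0
            count += int(value)
        if current is not None:
            print(f"{current}\t{count}")

    if __name__ == "__main__":
        mapper() if sys.argv[1:] == ["map"] else reducer()

A script like this would typically be submitted with the Hadoop Streaming jar, for example: hadoop jar hadoop-streaming.jar -mapper "wordcount.py map" -reducer "wordcount.py reduce" -input <hdfs-input> -output <hdfs-output>, where the file name and the HDFS paths are placeholders supplied by the cluster operator.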

SecuritySolutionsWatch.com: The Internet of Things (IoT) has created a business intelligence bounty and Big Data parameters are evolving and ever changing. Volume is just one part of the challenge… there are in fact three other V's at play here: variety, velocity and vulnerability. Please share your thoughts with us about this.

Diana Zavala: With the Internet of Things, there is a wealth of information to address, and parameters will continue to evolve as the world becomes more connected via relationships, networks and sensors. What is important is that you have the strategies, skills, organizational culture and tools to extract insight from your data regardless of these parameters. This means that fundamentally you still have to acquire, store, secure and analyze the information.

To manage Big Data effectively, you must be able to:
  • Gather, store and manage large amounts of data - up to millions of times more data than you handle today (volume).
  • Collect and store all relevant data - structured, semi-structured and unstructured (variety).
  • Analyze and react to it as quickly as it's created (velocity).
  • Keep it private, secure and compliant with regulatory requirements at all times (vulnerability).


However, you also need to be able to filter this information so that you are utilizing the relevant data that will provide business or mission VALUE to your organization and ultimately, your citizens. Ask yourself, do you have relevant data that is buried in your archives? Are you storing information that is no longer useful and merely increasing cost? Having a well-thought-out strategy and approach on how to optimize your information will help you deal with the growing body of data.

SecuritySolutionsWatch.com: Diana, what's your perspective on the opportunities facing government with regards to Big Data?

Diana Zavala: The opportunities for government to harness the value of data continue to expand as new sources of data become available. One particular example is the work we are doing with the UK's Norfolk County Council, which sought to understand the indicators of a family in need. The use of data enabled the government to implement an integrated information hub that provides a 360-degree view of its constituents and the services that can be extended to them in a timely and efficient manner, improving citizen support.

Additionally, the use of new types of data and the ability to handle volumes of information at high speed make it possible to detect and inhibit fraud and abuse in medical, disability and unemployment claims faster and more efficiently. Constantly shifting location data from RFID tags can be used to track millions of pieces of inventory in real time. Social media intelligence can be used to monitor and spot at-risk populations for bullying or human trafficking. All of this is possible through the appropriate analysis of Big Data.

Another example of efficiencies gained through the collection and informed use of Big Data is the HP Healthcare Analytics solution, which uses sophisticated pattern-matching techniques and probabilistic modeling to understand structured, semi-structured and unstructured digital information. It helps speed the chart review process for clinical use by physicians and healthcare analysts, provides chart abstraction for quality reporting, and supports population health and research activities. By providing a thorough analysis of a patient's medical records, HP Healthcare Analytics enables caregivers to provide a higher quality of care. For instance, potential complications can be identified while a patient is still in the hospital. Additionally, by looking at trends across the community, it can help identify outbreaks so that preventive measures can be established. And, ultimately, it lowers costs by automating manual processes and helping physicians get to the right diagnosis sooner.

SecuritySolutionsWatch.com: One of your colleagues at HP, Ed Keegan, recently discussed Continuous Monitoring with us: how it represents the maturation of information technology, and how automated, continuous auditing of the cyber environment can help ensure proper security control configuration and proactive IT monitoring for threats and vulnerabilities. Others use the term "Continuous Evaluation" in parallel. May we have your views on this topic?

Diana Zavala: Continuous Evaluation is an extension of Identity and Access Management. In the DoD environment, Continuous Evaluation is about assessing eligibility for access to classified information between an initial background investigation and a periodic reinvestigation (PR), or, over the career of a cleared employee, between PRs. HP is delivering advanced, data-driven security technologies to derive meaningful security intelligence from Big Data. This includes the ability to look at structured and unstructured information together in an integrated view, in order to make rapid decisions in a dynamic environment. Today's advanced persistent threat environment requires an enterprise to leverage real-time data so it can fulfill its mission, keep citizens safe and make efficient IT investment and resource decisions.

SecuritySolutionsWatch.com: Are there any other topics you'd like to discuss?

Diana Zavala: We, governments and their citizens alike, are in a new era of accelerated innovation based on new data sources. In previous decades, government organizations made decisions by analyzing their transactional business data and using this historic view, along with the insights of key personnel, to make daily decisions and projections about the future. With the rise of new technologies and platforms in social media, cloud and mobility, governments and citizens are coming together in new ways to engage and utilize new types of data. The ability to harness and create insights from the wealth of available data drives enhanced operational efficiency, actionable decisions and better service to citizens and constituents.