Recent research from technology industry trade association CompTIA suggests that enterprises looking to maximize the value of their data need to implement new practices. According to the Trends in Data Management report, many organizations are currently working with a blank slate when it comes to data optimization simply because advanced data practices haven’t been positioned as a top priority.
Building a strategy
Data value is in the eye of the beholder, stated Andy Neill, practice lead at Info-Tech Research Group, an IT research firm. “If you want to see value, the first step is making the data practically available to the business,” he explained in an interview. “This includes discoverability, classification, standard semantics, and accessibility.”
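Neill's four dimensions (discoverability, classification, standard semantics, and accessibility) are concrete enough to sketch in code. As a purely illustrative example, and not anything the article's sources describe, a minimal data catalog entry covering those four dimensions might look like this (all names here are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One dataset's entry in a hypothetical data catalog."""
    name: str            # discoverability: searchable identifier
    description: str     # discoverability: what the data contains
    classification: str  # classification: e.g. "public", "internal", "confidential"
    schema: dict = field(default_factory=dict)  # standard semantics: field -> agreed meaning
    access_path: str = ""                       # accessibility: where to reach the data

catalog = {}

def register(entry: CatalogEntry) -> None:
    """Registering a dataset makes it discoverable to the rest of the business."""
    catalog[entry.name] = entry

def find(keyword: str) -> list:
    """Simple keyword search over names and descriptions."""
    kw = keyword.lower()
    return [e for e in catalog.values()
            if kw in e.name.lower() or kw in e.description.lower()]

register(CatalogEntry(
    name="customer_orders",
    description="Daily customer order transactions",
    classification="internal",
    schema={"order_id": "unique order identifier", "amount": "order total in USD"},
    access_path="warehouse.sales.customer_orders",
))

print([e.name for e in find("order")])  # -> ['customer_orders']
```

Real catalogs add lineage, ownership, and access control, but even this skeleton makes data searchable rather than tribal knowledge.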
Neill observed that an important first step toward optimizing data value is to begin thinking about data markets and how data may be shared through independent data platforms. “Changing the culture to ‘open data first,’ and exposing data wherever possible for the greater good, is our responsibility,” he said.
Richard Clegg, a data analyst for SPR, an enterprise technology consulting firm, recommended creating a framework that incorporates both ethics and data stewardship. “When an organization is lagging on data optimization, it often helps to look inward at some of the fundamental challenges of your knowledge management,” he advised.
Today’s technology challenges are largely driven by obsolete data management and optimization models, observed Zhamak Dehghani, a principal consultant at software consulting firm ThoughtWorks. Updating an organization’s data optimization culture and decision-making necessitates both grassroots and top-down changes. “It also requires creating completely new incentive structures, KPIs, and/or objectives and key results (OKRs) that are tailored toward achieving outcomes using data and AI,” she explained. “For example, superficial KPIs, such as the number of datasets in the warehouse or lake, or the amount of data migrated to the company’s next-generation data platform or cloud, are not going to drive a change to become data-driven.”
One of the biggest obstacles to data optimization is a lack of experts. “You can’t optimize data without a clear understanding of that data,” advised David Linthicum, chief cloud strategy officer for professional services firm Deloitte. “This means database, storage, and analytics SMEs.”
Organizations can’t simply expect experts to arrive on their doorstep. “They need to work proactively to create career development paths for generalist engineers to become familiar with modern data management technologies and to have opportunities to grow these skills through application in the organization,” Dehghani said. She noted that organizations should make data management knowledge more easily available to IT generalists. “This can be done through abstraction of complex data infrastructures with self-serve high-level data platforms, and by utilizing openly-accessible technologies instead of proprietary ones,” she explained.
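The "abstraction of complex data infrastructures" Dehghani describes is essentially a facade: one high-level call a generalist engineer can use without learning each underlying store. As a rough sketch only, with an in-memory stand-in for a real warehouse and all names invented for illustration:

```python
import csv
import io

# Hypothetical low-level store; in practice this would be a warehouse,
# lake, or object store with its own client libraries.
_WAREHOUSE = {"sales_2024": "region,revenue\nEMEA,120\nAMER,340\n"}

class DataPlatform:
    """Self-serve facade: one high-level call instead of per-store client code."""

    def get_dataset(self, name: str) -> list:
        """Return the dataset as a list of dicts, hiding storage and format details."""
        raw = _WAREHOUSE.get(name)
        if raw is None:
            raise KeyError(f"unknown dataset: {name}")
        return list(csv.DictReader(io.StringIO(raw)))

platform = DataPlatform()
rows = platform.get_dataset("sales_2024")
print(rows[0]["region"])  # -> EMEA
```

The design point is that the generalist writes against `get_dataset`, while data specialists evolve what sits behind it.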
Ashish Verma, managing director and analytics and information management lead at Deloitte Consulting, suggested establishing a learning and development curriculum focused on data optimization and related technologies, delivered through online instruction, podcasts, webinars, on-site boot camps, external training, and hands-on experience. “Hands-on experience, to learn the latest techniques on [the] data delivery lifecycle,” he said.
There are two levels of training, Neill said. “The first is for the general business, to enable them to discover and utilize data,” he noted. “The second is for IT and power users who need to know how to implement data products and services through standard policy, procedure and technology.”
Blockchain and digital ledger technologies are creating new opportunities for data optimization, providing the ability to exchange data with external sources securely and with verifiable authority. “Managing digital assets with blockchain as a source of authority could result in more effective ways for organizations to gather and distribute data,” Clegg observed.
Linthicum agreed. “The ability to do peer-to-peer transaction validation will open up a world of data points that will finally allow automation,” he said. “Supply chains will benefit most from this.”
Blockchain and other digital ledger technologies can also play an important role in assuring data quality and data integrity, both key data optimization goals. “In blockchain, the dataset keeps that integrity due to the data going through a verification process which ensures quality,” Verma explained. Blockchain technologies can also be used to store insights and data analysis, so that project teams aren’t required to repeat analysis that other teams have already carried out. Blockchain can also prevent the reuse of data that has already been employed.
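The integrity property the sources describe rests on hash chaining: each entry is hashed together with the hash of the entry before it, so editing any earlier record invalidates everything after it. A minimal sketch of that mechanism (not any specific blockchain product, and the record fields are invented for illustration):

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash the record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain: list, record: dict) -> None:
    """Add a block that commits to everything already on the chain."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev, "hash": block_hash(record, prev)})

def verify(chain: list) -> bool:
    """Recompute every hash; any edited record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(block["record"], prev):
            return False
        prev = block["hash"]
    return True

chain = []
append(chain, {"dataset": "orders", "checksum": "abc123"})
append(chain, {"dataset": "returns", "checksum": "def456"})
assert verify(chain)  # untampered chain checks out

chain[0]["record"]["checksum"] = "evil"  # tamper with an earlier entry
assert not verify(chain)                 # verification now fails
```

Production systems add distributed consensus on top of this, but the tamper-evidence that makes the ledger useful for data quality comes from the chained hashes alone.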
Verma suggested that blockchain is poised to emerge as a key technology for managing data traveling to and from IoT devices operating in decentralized architecture environments. “Blockchain technologies can ensure [IoT] data integrity/quality … for artificial intelligence and machine learning processing,” he said.
John Edwards is a veteran business technology journalist. His work has appeared in The New York Times, The Washington Post, and numerous business and technology publications, including Computerworld, CFO Magazine, IBM Data Management Magazine, RFID Journal, and Electronic …