Data management

[Figure: The data lifecycle]

Data management comprises all disciplines related to managing data as a valuable resource.

Concept

The concept of data management arose in the 1980s as technology moved from sequential processing (first punched cards, then tape) to random-access storage. Since it was now possible to store a discrete fact and quickly access it using random-access disk technology, those who argued that data management was more important than business process management used arguments such as "a customer's home address is stored in 75 (or some other large number) places in our computer systems." However, during this period random-access processing was not competitively fast, so those who argued that "process management" was more important than "data management" used batch processing time as their primary argument. As software applications evolved into real-time, interactive usage, it became obvious that both management processes were important. If the data was not well defined, it would be misused in applications; if the process was not well defined, it was impossible to meet user needs.

Topics in Data Management

Topics in data management include:

  • Data governance
  • Data architecture, analysis and design
  • Data modeling
  • Database administration
  • Data quality management
  • Data security management
  • Reference and master data management
  • Data warehousing and business intelligence
  • Document, record and content management
  • Metadata management

Usage

In modern management usage, the term data is increasingly replaced by information or even knowledge in non-technical contexts. Thus data management has become information management or knowledge management. This trend obscures the raw data processing and renders interpretation implicit. The distinction between data and derived value is illustrated by the information ladder. However, data has staged a comeback with the popularisation of the term big data, which refers to the collection and analysis of massive data sets.

Several organisations have established data management centers (DMC)[1] for their operations.

Integrated data management

Integrated data management (IDM) is a tools approach to facilitating data management and improving performance. IDM consists of an integrated, modular environment for managing enterprise application data and for optimizing data-driven applications over their lifetime.[2][3][4][5] IDM's purpose is to:

  • Produce enterprise-ready applications faster
  • Improve data access, speed iterative testing
  • Empower collaboration between architects, developers and DBAs
  • Consistently achieve service level targets
  • Automate and simplify operations
  • Provide contextual intelligence across the solution stack
  • Support business growth
  • Accommodate new initiatives without expanding infrastructure
  • Simplify application upgrades, consolidation and retirement
  • Facilitate alignment, consistency and governance
  • Define business policies and standards up front; share, extend, and apply them throughout the lifecycle (see the sketch after this list)
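
To illustrate the last point, the sketch below shows one way a business data policy might be defined once as code and then reused at different lifecycle stages, such as masking production data for test environments and deciding when records may be archived. This is a minimal illustration in Python; the DataPolicy class, its field names and the masking rule are assumptions made for the example, not the interface of any particular IDM product.

    from dataclasses import dataclass, field


    @dataclass
    class DataPolicy:
        """A hypothetical, centrally defined data policy."""
        name: str
        retention_days: int                              # how long records may be kept
        masked_fields: set = field(default_factory=set)  # fields to anonymise outside production


    def mask_record(record: dict, policy: DataPolicy) -> dict:
        """Return a copy of a record with policy-protected fields masked,
        for example when copying production data into a test environment."""
        return {k: "***" if k in policy.masked_fields else v for k, v in record.items()}


    def is_expired(age_days: int, policy: DataPolicy) -> bool:
        """Decide whether a record has exceeded its retention period (archiving stage)."""
        return age_days > policy.retention_days


    if __name__ == "__main__":
        # One policy definition, shared by the test-data and archiving stages.
        customer_policy = DataPolicy(
            name="customer-data",
            retention_days=365 * 7,
            masked_fields={"home_address", "date_of_birth"},
        )
        record = {"customer_id": 42, "home_address": "1 Main St", "date_of_birth": "1980-01-01"}
        print(mask_record(record, customer_policy))   # masked copy for a test environment
        print(is_expired(3000, customer_policy))      # True: candidate for archiving

Because the policy is an ordinary, shareable object, architects, developers and DBAs can work against the same definition rather than re-encoding the rules in each tool.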

Data Management Frameworks

A data management framework (DMF) is a system of thinking, terminology, documentation, resources and insights that allows users to view data-related concepts and information in their own context and in the broader context of the framework, thereby enabling them to integrate their conversations and work.

There are a number of DMFs available.

William Richard Evans, of South Africa, has developed three fully integrated data management frameworks. The Data Atom Data Management Framework version 1.0 was developed between 2010 and 2014, and version 2.0 between 2014 and 2017. With the advent of artificial intelligence, the Internet of Things and data lakes, version 2.0 was replaced by the more comprehensive Multi Dimensional Data Management Framework V3.0, which covers 20 data management disciplines and 7 data environments.[6]

Organizations

The definition provided by DAMA International, the professional organization for the data management profession, is: "Data Management is the development and execution of architectures, policies, practices and procedures that properly manage the full data life-cycle needs of an enterprise." This broad definition encompasses professions that may not have direct technical contact with lower-level aspects of data management, such as relational database management.

Alternatively, the definition in the DAMA International[7] Data Management Body of Knowledge (DAMA DMBOK)[8] is: "Data management is the development, execution and supervision of plans, policies, programs and practices that control, protect, deliver and enhance the value of data and information assets."

Corporate data quality management (CDQM) is, according to the European Foundation for Quality Management and the Competence Center Corporate Data Quality (CC CDQ, University of St. Gallen), the whole set of activities intended to improve corporate data quality (both reactive and preventive). The main premise of CDQM is the business relevance of high-quality corporate data. CDQM comprises the following activity areas:[9]

  • Strategy for Corporate Data Quality: As CDQM is affected by various business drivers and requires the involvement of multiple divisions in an organization, it must be considered a company-wide endeavor.
  • Corporate Data Quality Controlling: Effective CDQM requires compliance with standards, policies, and procedures. Compliance is monitored according to previously defined metrics and performance indicators and reported to stakeholders (a minimal measurement sketch follows this list).
  • Corporate Data Quality Organization: CDQM requires clear roles and responsibilities for the use of corporate data. The CDQM organization defines tasks and privileges for decision making for CDQM.
  • Corporate Data Quality Processes and Methods: To handle corporate data properly and in a standardized way across the entire organization, and to ensure corporate data quality, standard procedures and guidelines must be embedded in the company's daily processes.
  • Data Architecture for Corporate Data Quality: The data architecture consists of the data object model - which comprises the unambiguous definition and the conceptual model of corporate data - and the data storage and distribution architecture.
  • Applications for Corporate Data Quality: Software applications support the activities of Corporate Data Quality Management. Their use must be planned, monitored, managed and continuously improved.
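
As a concrete illustration of the controlling activity, the following minimal Python sketch measures two common data quality indicators (completeness and uniqueness) over a set of customer records and compares them against previously defined thresholds. The metric definitions, thresholds and record layout are assumptions made for this example; they are not prescribed by the EFQM/CC CDQ framework itself.

    # Hypothetical quality thresholds, as a CDQM standard might define them.
    THRESHOLDS = {"completeness": 0.98, "uniqueness": 0.99}


    def completeness(records, field_name):
        """Share of records in which the given field is present and non-empty."""
        records = list(records)
        if not records:
            return 1.0
        filled = sum(1 for r in records if r.get(field_name) not in (None, ""))
        return filled / len(records)


    def uniqueness(records, field_name):
        """Share of distinct values among the non-empty values of the field."""
        values = [r.get(field_name) for r in records if r.get(field_name) not in (None, "")]
        if not values:
            return 1.0
        return len(set(values)) / len(values)


    def quality_report(records, key_field, required_field):
        """Compare measured indicators against the defined thresholds."""
        measured = {
            "completeness": completeness(records, required_field),
            "uniqueness": uniqueness(records, key_field),
        }
        # Each metric maps to (measured value, meets threshold?) for stakeholder reporting.
        return {name: (round(value, 2), value >= THRESHOLDS[name])
                for name, value in measured.items()}


    if __name__ == "__main__":
        customers = [
            {"customer_id": 1, "home_address": "1 Main St"},
            {"customer_id": 2, "home_address": ""},           # incomplete record
            {"customer_id": 2, "home_address": "2 High St"},  # duplicate key value
        ]
        print(quality_report(customers, key_field="customer_id", required_field="home_address"))

In practice such indicators would be computed by data quality tooling against agreed business rules; the point here is only that controlling presupposes explicitly defined metrics and thresholds.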

The DAMA Guide to the Data Management Body of Knowledge (DAMA-DMBOK Guide) was published in 2009, and a second edition was published in 2017.

References

  1. For example: Kumar, Sangeeth; Ramesh, Maneesha Vinodini (2010). "Lightweight Management framework (LMF) for a Heterogeneous Wireless Network for Landslide Detection". In Meghanathan, Natarajan; Boumerdassi, Selma; Chaki, Nabendu; Nagamalai, Dhinaharan. Recent Trends in Networks and Communications: International Conferences, NeCoM 2010, WiMoN 2010, WeST 2010, Chennai, India, July 23-25, 2010. Proceedings. Communications in Computer and Information Science. 90. Springer. p. 466. ISBN 9783642144936. Retrieved 2016-06-16. 4.4 Data Management Center (DMC)[:] The Data Management Center is the data center for all of the deployed cluster networks. Through the DMC, the LMF allows the user to list the services in any cluster member belonging to any cluster [...].
  2. Integrated Data Management: Managing data across its lifecycle by Holly Hayes
  3. Organizations thrive on Data by Eric Naiburg
  4. Fragmented Management Across The Data Life Cycle Increases Cost And Risk - A commissioned study conducted by Forrester Consulting on behalf of IBM October 2008
  5. integrated IBM Data Management information center
  6. "The Multi Dimensional Data Management Framework V3.0)".
  9. EFQM; IWI-HSG: EFQM Framework for Corporate Data Quality Management. Brussels: EFQM Press, 2011.