
News & Announcements

Stay on top of all DAMA-RMC news and announcements here.

  • 09/11/2024 7:00 AM | Anonymous member (Administrator)


    In any organization, certain data is required across business areas, processes, and systems. The overall organization and its customers benefit if this data is shared and all business units can access the same customer lists, geographic location codes, business unit lists, delivery options, part lists, accounting cost center codes, governmental tax codes, and other data used to run the business. People using data generally assume a level of consistency exists across the organization, until they see disparate data.

    In most organizations, systems and data evolve more organically than data management professionals would like. Particularly in large organizations, various projects and initiatives, mergers and acquisitions, and other business activities result in multiple systems executing essentially the same functions, isolated from each other. These conditions inevitably lead to inconsistencies in data structure and data values between systems. This variability increases costs and risks. Both can be reduced through the management of Master Data and Reference Data.
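    The benefit of shared Reference Data can be sketched in a few lines. This is a minimal, hypothetical illustration (the codes and function names are invented for this example, not taken from any standard): every consuming system resolves codes against one authoritative table instead of maintaining its own copy, which is how inconsistencies between systems arise.

```python
# Minimal sketch of a shared reference-data table (hypothetical codes).
# All systems resolve codes against one authoritative mapping instead of
# each keeping its own, potentially inconsistent, copy.
COST_CENTER_CODES = {
    "CC-100": "Corporate Finance",
    "CC-200": "Field Operations",
    "CC-300": "Information Technology",
}

def resolve_cost_center(code: str) -> str:
    """Return the authoritative description for a cost center code."""
    try:
        return COST_CENTER_CODES[code]
    except KeyError:
        raise ValueError(f"Unknown cost center code: {code}")

print(resolve_cost_center("CC-200"))  # Field Operations
```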

  • 09/04/2024 7:00 AM | Anonymous member (Administrator)

    Documents, records, and other unstructured content represent risk to an organization. Managing this risk and getting value from this information both require governance. Drivers include:

    • Legal and regulatory compliance
    • Defensible disposition of records
    • Proactive preparation for e-discovery
    • Security of sensitive information
    • Management of risk areas such as email and Big Data

    Principles of successful Information Governance programs are emerging. One set of principles is the ARMA GARP® principles (see Section 1.2). Other principles include:

    • Assign executive sponsorship for accountability
    • Educate employees on information governance responsibilities
    • Classify information under the correct record code or taxonomy category
    • Ensure authenticity and integrity of information
    • Determine that the official record is electronic unless specified differently
    • Develop policies for alignment of business systems and third-parties to information governance standards
    • Store, manage, make accessible, monitor, and audit approved enterprise repositories and systems for records and content
    • Secure confidential or personally identifiable information
    • Control unnecessary growth of information
    • Dispose of information when it reaches the end of its lifecycle
    • Comply with requests for information (e.g., discovery, subpoena)
    • Improve continuously
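    Two of the principles above — classifying information under a record code and disposing of it at the end of its lifecycle — can be sketched as a simple retention check. The record codes and retention periods below are hypothetical, invented for illustration only:

```python
from datetime import date

# Hypothetical retention schedule: record code -> retention period in years.
RETENTION_YEARS = {"FIN-INV": 7, "HR-PAY": 6, "CORR-GEN": 2}

def is_eligible_for_disposition(record_code: str, created: date,
                                today: date, on_legal_hold: bool) -> bool:
    """A record may be disposed of only when its retention period has
    elapsed and it is not subject to a legal hold."""
    if on_legal_hold:
        return False
    years = RETENTION_YEARS[record_code]
    return today >= created.replace(year=created.year + years)

# A 2-year general-correspondence record created in 2020 is now disposable.
print(is_eligible_for_disposition("CORR-GEN", date(2020, 1, 15),
                                  date(2024, 6, 1), on_legal_hold=False))  # True
```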

    The Information Governance Reference Model (IGRM) shows the relationship of Information Governance to other organizational functions. The outer ring includes the stakeholders who put policies, standards, processes, tools and infrastructure in place to manage information. The center shows a lifecycle diagram with each lifecycle component within the color or colors of the stakeholder(s) who executes that component. The IGRM complements ARMA’s GARP®.

    Sponsorship by someone close to or within the ‘C’ suite is a critical requirement for the formation and sustainability of the Information Governance program. A cross-functional, senior-level Information Council or Steering Committee should be established and meet on a regular basis. The Council is responsible for an enterprise Information Governance strategy, operating procedures, guidance on technology and standards, communications and training, monitoring, and funding. Information Governance policies are written for the stakeholder areas, and then ideally technology is applied for enforcement.

  • 08/28/2024 7:00 AM | Anonymous member (Administrator)


    Discovery is a legal term that refers to the pre-trial phase of a lawsuit, during which both parties request information from each other to establish the facts of the case and to assess the strength of the arguments on either side. The US Federal Rules of Civil Procedure (FRCP) have governed the discovery of evidence in lawsuits and other civil cases since 1938. For decades, paper-based discovery rules were applied to e-discovery. In 2006, amendments to the FRCP accommodated the discovery practices and requirements for electronically stored information (ESI) in the litigation process.

    Other global regulations have requirements specific to the ability of an organization to produce electronic evidence. Examples include the UK Bribery Act, Dodd-Frank Act, Foreign Account Tax Compliance Act (FATCA), Foreign Corrupt Practices Act, EU Data Protection Regulations and Rules, global anti-trust regulations, sector-specific regulations, and local court procedural rules.

    Electronic documents usually have Metadata (which may not be available for paper documents) that plays an important part in evidence. Legal requirements come from the key legal processes such as e-discovery, as well as data and records retention practices, the legal hold notification (LHN) process, and legally defensible disposition practices. LHN includes identifying information that may be requested in a legal proceeding, locking that data or document down to prevent editing or deletion, and then notifying all parties in an organization that the data or document in question is subject to a legal hold.
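    The LHN process described above — identify, lock down, notify — can be sketched as follows. This is a minimal in-memory illustration with invented class and field names, not a standard schema or a real records-management API:

```python
# Minimal sketch of a legal hold notification (LHN) workflow: identify
# items, lock them against edit/deletion, and notify responsible parties.
class LegalHoldManager:
    def __init__(self):
        self.held_items = set()      # item ids locked against edit/delete
        self.notifications = []      # (custodian, items) pairs notified

    def apply_hold(self, item_ids, custodians):
        """Lock identified items and notify all responsible parties."""
        self.held_items.update(item_ids)
        for person in custodians:
            self.notifications.append((person, tuple(item_ids)))

    def can_delete(self, item_id) -> bool:
        """Items under legal hold must not be edited or deleted."""
        return item_id not in self.held_items

mgr = LegalHoldManager()
mgr.apply_hold(["DOC-1", "DOC-2"], ["records_manager@example.com"])
print(mgr.can_delete("DOC-1"))  # False
```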

    This figure depicts a high-level Electronic Discovery Reference Model developed by EDRM, a standards and guidelines organization for e-discovery. This framework provides an approach to e-discovery that is useful for people involved in identifying how and where the relevant internal data is stored, what retention policies apply, what data is not accessible, and what tools are available to assist in the identification process.

    The EDRM model assumes that data or information governance is in place. The model includes eight e-discovery phases that can be iterative. As e-discovery progresses, the volume of discoverable data and information is greatly reduced while its relevance greatly increases.

    The first phase, Identification, has two sub-phases: Early Case Assessment and Early Data Assessment (not depicted in the diagram). In Early Case Assessment, the legal case itself is assessed for pertinent information, called descriptive information or Metadata (e.g., keywords, date ranges, etc.). In Early Data Assessment, the types and locations of data relevant to the case are assessed. Data assessment should identify policies related to the retention or destruction of relevant data so that ESI can be preserved. Interviews should be held with records management personnel, data custodians or data owners, and information technology personnel to obtain pertinent information. In addition, the involved personnel need to understand the case background, legal hold, and their role in the litigation.

    The next phases in the model are Preservation and Collection. Preservation ensures that the data that has been identified as potentially relevant is placed in a legal hold so it is not destroyed. Collection includes the acquisition and transfer of identified data from the company to its legal counsel in a legally defensible manner.

    During the Processing phase, data is de-duplicated, searched, and analyzed to determine which data items will move forward to the Review phase. In the Review phase, documents are identified to be presented in response to the request. Review also identifies privileged documents that will be withheld. Much of the selection depends on Metadata associated with the documents.
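    The de-duplication step of the Processing phase can be sketched by hashing document content. This is a simplified illustration assuming exact duplicates; real e-discovery tools also apply near-duplicate detection and email-thread analysis:

```python
import hashlib

def deduplicate(documents):
    """Remove exact-duplicate documents before Review by comparing
    SHA-256 hashes of their content."""
    seen = set()
    unique = []
    for doc in documents:
        digest = hashlib.sha256(doc.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

# Three collected items, one exact duplicate -> two move on to Review.
print(len(deduplicate(["report A", "report B", "report A"])))  # 2
```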

    Processing and Review depend on analysis, but Analysis is called out as a separate phase with a focus on content. The goal of content analysis is to understand the circumstances, facts, and potential evidence in litigation or investigation, in order to formulate a strategy in response to the legal situation.

    In the Production phase, data and information are turned over to opposing counsel, based on agreed-to specifications. Original sources of information may be files, spreadsheets, email, databases, drawings, photographs, data from proprietary applications, website data, voicemail, and much more. The ESI can be collected, processed and output to a variety of formats. Native production retains the original format of the files. Near-native production alters the original format through extraction and conversion. ESI can be produced in an image, or near paper, format. Fielded data is Metadata and other information extracted from native files when ESI is processed and produced in a text-delimited file or XML load file. The lineage of the materials provided during the Production phase is important, because no one wants to be accused of altering data or information provided.
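    Producing fielded data as described above can be sketched as writing extracted Metadata to a text-delimited load file. The field names below are illustrative, not a required production specification:

```python
import csv
import io

def write_load_file(items):
    """Write extracted Metadata fields to a tab-delimited load file
    (returned as a string for this sketch; hypothetical field names)."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf,
        fieldnames=["doc_id", "custodian", "date_sent", "file_type"],
        delimiter="\t",
    )
    writer.writeheader()
    for item in items:
        writer.writerow(item)
    return buf.getvalue()

rows = [{"doc_id": "PROD-0001", "custodian": "J. Smith",
         "date_sent": "2023-04-02", "file_type": "email"}]
print(write_load_file(rows).splitlines()[0])  # tab-delimited header row
```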

    Displaying the ESI at depositions, hearings, and trials is part of the Presentation phase. The ESI exhibits can be presented in paper, near paper, near-native and native formats to support or refute elements of the case. They may be used to elicit further information, validate existing facts or positions, or persuade an audience.

  • 08/21/2024 7:00 AM | Anonymous member (Administrator)


    Document management includes records management. Managing records has special requirements. Records management includes the full lifecycle: from record creation or receipt through processing, distribution, organization, and retrieval, to disposition. Records can be physical (e.g., documents, memos, contracts, reports, or microfiche); electronic (e.g., email content, attachments, and instant messaging); content on a website; documents on all types of media and hardware; and data captured in databases of all kinds. Hybrid records, such as aperture cards (a paper record with an embedded microfiche window containing details or supporting material), combine formats. A Vital Record is a type of record required to resume an organization’s operations in the event of a disaster.

    Trustworthy records are important not only for record keeping but also for regulatory compliance. Having signatures on the record contributes to a record’s integrity. Other integrity actions include verification of the event (i.e., witnessing in real time) and double-checking the information after the event.

    Well-prepared records have characteristics such as:

    • Content: Content must be accurate, complete and truthful.
    • Context: Descriptive information (Metadata) about the record’s creator, date of creation, or relationship to other records should be collected, structured and maintained with the record at the time of record creation.
    • Timeliness: A record should be created promptly after the event, action or decision occurs.
    • Permanency: Once they are designated as records, records cannot be changed for the legal length of their existence.
    • Structure: The appearance and arrangement of a record’s content should be clear. Records should be recorded on the correct forms or templates. Content should be legible, and terminology should be used consistently.

      Many records exist in both electronic and paper formats. Records Management requires the organization to know which copy (electronic or paper) is the official ‘copy of record’ to meet record keeping obligations. Once the copy of record is determined, the other copy can be safely destroyed.
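    Several of the characteristics above — context Metadata captured at creation time, permanency, and the designation of a copy of record — can be sketched with a frozen data structure. The field names are illustrative, not a records-management standard:

```python
from dataclasses import dataclass
from datetime import datetime

# frozen=True models permanency: once designated, a record's fields
# cannot be changed. Field names here are hypothetical.
@dataclass(frozen=True)
class Record:
    content: str
    creator: str                  # context Metadata, captured at creation
    created: datetime
    related_records: tuple = ()
    is_copy_of_record: bool = False  # the official copy for record keeping

rec = Record(content="Board minutes, Q3", creator="secretary",
             created=datetime(2024, 7, 1, 9, 0), is_copy_of_record=True)
print(rec.is_copy_of_record)  # True
```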

  • 08/14/2024 9:31 AM | Anonymous member (Administrator)


    Exciting updates are coming to the Certified Data Management Professional (CDMP) certification exams starting in October 2024. The exams will now cover the DAMA-DMBOK2 Revised Edition, an updated and improved version of the Data Management Body of Knowledge. This revised edition addresses known inconsistencies and inaccuracies, making it a more reliable and comprehensive resource for data management professionals. For those preparing for the CDMP certification, it's important to note that this revised edition will be the new reference material.

    The DAMA-DMBOK2 Revised Edition has been created to ensure the content remains relevant and accurate for data management practitioners. DAMA International embarked on this update to improve upon the previous version, incorporating feedback from members and volunteers. The revised edition aims to provide a more consistent and precise framework, making it easier for professionals to understand and apply the principles of data management.

    Key improvements in the DAMA-DMBOK2 Revised Edition include standardized terminology and acronyms, corrections of typos and errors, improvements to context diagrams, and enhancements to the Data Quality chapter. These updates ensure the content is not only accurate but also easier to comprehend and apply. The revised edition is available for purchase (and discounted for Professional Members) through DAMA-RMC at DMBoK.

    For those currently preparing for the CDMP certification, you can continue using the DAMA-DMBOK 2nd Edition until October 2024. After that date, the revised edition will be the authoritative resource.

    We encourage you to join one of our upcoming 30-minute informational sessions on the CDMP Study Group during the week of August 26th. You can also enroll in our 12-week virtual study program, which will start in September. To enroll in the study program, you must be a Professional Member; anyone can attend the informational sessions. We also have a self-paced study option available.

    Information session links:

    For more information, please contact jhorner@dama-rockymountainchapter.org or visit the CDMP Webpage. Additionally, please update your DAMA RMC profile to indicate your interest in the CDMP so we can keep you up to date on all things DMBoK and CDMP.

  • 08/14/2024 9:22 AM | Anonymous member (Administrator)

    We are pleased to invite you to a concise and informative session on the Certified Data Management Professional (CDMP) certification. This 30-minute meeting will cover essential topics and recent updates, ensuring you are well-prepared for the certification process.

    The study sessions will run from the week of September 9th through December (meeting time and day of the week TBD; tentatively planning for Wednesdays at 6pm).

    At the conclusion, we will host ½-day virtual prep reviews and the annual “pay if you pass” event, which allows attendees to pay the exam fee only if they pass.

    **Agenda:**

    1. **CDMP Exam Overview**

    • Description of the CDMP certification levels and exam structure.
    • Key topics and study resources.

    2. **Study Session Logistics**

    • Registration, scheduling, and common logistical questions
    • Format

    3. **New Changes to DMBOK / CDMP Exams**

    • Overview of recent updates to the Data Management Body of Knowledge (DMBOK) and changes in CDMP exam content.

    4. **General Q&A**

    **Date and Time:**

    We are offering two 30-minute information sessions:

    Please RSVP to ProfessionalDevelopmentVP@damarmc.org as soon as possible to confirm your attendance. We look forward to your participation and engaging discussions.

  • 08/11/2024 11:25 AM | Anonymous member (Administrator)


    Document and Content Management entails controlling the capture, storage, access, and use of data and information stored outside relational databases. Its focus is on maintaining the integrity of and enabling access to documents and other unstructured or semi-structured information, which makes it roughly equivalent to data operations management for relational databases. However, it also has strategic drivers. In many organizations, unstructured data has a direct relationship to structured data. Management decisions about such content should be applied consistently. In addition, like other types of data, documents and unstructured content are expected to be secure and of high quality. Ensuring security and quality requires governance, reliable architecture, and well-managed Metadata.

    The primary business drivers for document and content management include regulatory compliance, the ability to respond to litigation and e-discovery requests, and business continuity requirements. Good records management can also help organizations become more efficient. Well-organized, searchable websites that result from effective management of ontologies and other structures that facilitate searching help improve customer and employee satisfaction.

    Laws and regulations require that organizations maintain records of certain kinds of activities. Most organizations also have policies, standards, and best practices for record keeping. Records include both paper documents and electronically stored information (ESI). Good records management is necessary for business continuity. It also enables an organization to respond in the case of litigation.

    E-discovery is the process of finding electronic records that might serve as evidence in a legal action. As the technology for creating, storing, and using data has developed, the volume of ESI has increased exponentially. Some of this data will undoubtedly end up in litigation or regulatory requests.

    The ability of an organization to respond to an e-discovery request depends on how proactively it has managed records such as email, chats, websites, and electronic documents, as well as raw application data and Metadata. Big Data has become a driver for more efficient e-discovery, records retention, and strong information governance.

    Gaining efficiencies is a driver for improving document management. Technological advances in document management are helping organizations streamline processes, manage workflow, eliminate repetitive manual tasks, and enable collaboration. These technologies have the additional benefits of enabling people to locate, access, and share documents more quickly. They can also prevent documents from being lost. This is very important for e-discovery. Money is also saved by freeing up file cabinet space and reducing document handling costs.

  • 08/05/2024 2:06 PM | Anonymous member (Administrator)


    An Enterprise Service Bus (ESB) is a system that acts as an intermediary, passing messages between systems. Applications can send and receive messages or files using the ESB and are encapsulated from the other processes on the ESB. The ESB is an example of loose coupling: it acts as the service layer between the applications.
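    The intermediary role of an ESB can be sketched with a minimal publish/subscribe bus. This is a toy illustration with invented class names, not a real ESB product API: the sending application knows only the bus and a topic, never the receiving application.

```python
from collections import defaultdict

# Minimal sketch of an ESB-style intermediary: applications publish and
# subscribe by topic and never reference each other directly.
class ServiceBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a handler to receive messages published on a topic."""
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        for handler in self.subscribers[topic]:
            handler(message)

bus = ServiceBus()
received = []
bus.subscribe("orders", received.append)   # application B listens
bus.publish("orders", {"id": 42})          # application A knows only the bus
print(received)  # [{'id': 42}]
```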

  • 07/31/2024 7:00 AM | Anonymous member (Administrator)


    Coupling describes the degree to which two systems are entwined. Two systems that are tightly coupled usually have a synchronous interface, where one system waits for a response from the other. Tight coupling represents a riskier operation: if one system is unavailable, then they are both effectively unavailable, and the business continuity plan for both has to be the same.

    Where possible, loose coupling is a preferred interface design, where data is passed between systems without waiting for a response and one system may be unavailable without causing the other to be unavailable. Loose coupling can be implemented using various techniques with services, APIs, or message queues. This figure illustrates a possible loose coupling design.

    Service Oriented Architecture using an Enterprise Service Bus is an example of a loosely coupled data interaction design pattern.

    Where the systems are loosely coupled, replacement of systems in the application inventory can theoretically be performed without rewriting the systems with which they interact, because the interaction points are well-defined.
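    The loose-coupling behavior described above can be sketched with a message queue. This is a simplified single-process illustration (real designs use a broker or an ESB): the sender does not wait for a response, and the receiver can drain messages later even if it was unavailable when they were sent.

```python
import queue

# Sketch of loose coupling via a message queue: system A fires and
# forgets; system B processes whenever it becomes available.
outbox = queue.Queue()

def send(msg):
    """System A: enqueue a message without waiting for a response."""
    outbox.put(msg)

def drain():
    """System B: process all pending messages when available."""
    msgs = []
    while not outbox.empty():
        msgs.append(outbox.get())
    return msgs

send("update-1")
send("update-2")   # system B may be down right now; sending still succeeds
print(drain())     # ['update-1', 'update-2']
```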


