3D Data Creation to Curation: Community Standards for 3D Data Preservation collects the efforts of the Community Standards for 3D Data Preservation (CS3DP) initiative, a large practicing community of librarians, researchers, engineers, and designers, to move toward the establishment of shared guidelines, practices, and standards.
Using a collaborative approach for standards development that promotes individual investment and broad adoption, this group has produced a work that captures the shared preservation needs of the whole community.
Chapters cover best practices for 3D data preservation, management, metadata, legal issues, and access. Beginning with surveys of current practices, the authors provide recommendations for implementing standards and identify areas in which further development is required. A glossary of key terms and acronyms is included for easy reference. 3D Data Creation to Curation is intended for a broad audience from 3D data novices to seasoned practitioners, as well as those who may not be involved in the creation of the data but are tasked with curating, migrating, and sustaining access to these data long-term.
The Archaeological Data Export Standard (proper spelling Archäologischer DateneXport, abbreviated ADeX) is a data exchange format developed by a commission of state archaeologists. It allows essential attributes of archaeological sites (e.g. localisation, dating, description of features) to be exchanged in a standardised way between institutions of archaeological monument preservation and persons and institutions of archaeological research (students, research institutes, museums, etc.) in Germany. Source: Wikipedia, with modifications (CC BY-SA 4.0)
Additional information can be found at:
The completed questionnaire of the Archaeology Data Service for certification as a CoreTrustSeal-certified data repository provides a good orientation for the certification process of an archaeological repository.
Further completed questionnaires from certified repositories can be found on the CoreTrustSeal website.
The Getty Research Institute’s Art & Architecture Thesaurus (AAT) is a structured vocabulary for describing and indexing the visual arts and architecture. It is used worldwide to facilitate standardised recording and searching in architectural, art and historical collections. The AAT contains terms, descriptions and other metadata for generic concepts related to art, architecture, conservation, archaeology and other cultural heritage.
The Data Archiving Guide (DAG) is designed to support the work of employees of data repositories by providing a general understanding of the full range of activities a data repository performs. Although the DAG was developed for and by employees of social science data archives, much of the information applies to other archives as well and is equally useful for archiving professionals from other disciplines. The Data Archiving Guide has been created for CESSDA ERIC by a number of its service providers’ experts.
Chapter 5.4 “Trustworthy data archives” in particular provides a good overview of the relationship between FAIR principles and the concept of trustworthiness, and how these qualities are reflected within data archives via policies, technology, internal training, and dissemination. Chapter 1.5 “What is a certified archive” contains information on the certification process for trustworthy digital repositories (TDRs).
Chapter 1: Data Archives – a Quick Introduction
Chapter 2: Policies of Data Archives
Chapter 3: Pre-ingest
Chapter 4: Ingest and Curation
Chapter 5: FAIR-enabling and Trustworthy Qualities of Data Archives
CoreTrustSeal is an international, community-based, non-governmental, and non-profit organization promoting sustainable and trustworthy data infrastructures. The CoreTrustSeal Requirements describe the characteristics required to be a trustworthy repository for digital data and metadata. Each Requirement is accompanied by Guidance text describing the response statements and evidence that applicants must provide to enable an objective review. Applicants must respond to all of the Requirements.
CSVW is a standard for describing and clarifying the content of CSV tables.
The CSVW syntax specification describes a data model for tabular data. That is to say, it defines that a table is a collection of cells arranged into columns and rows. It also describes how to annotate a table with a metadata file to help processors parse and interpret the contents.
The schema for the metadata file is provided by the CSVW metadata vocabulary. The vocabulary defines the properties that can be used in an annotation. This includes things like a schema of column descriptions, datatypes and foreign key relations.
(Source)
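To illustrate the kind of annotation the CSVW metadata vocabulary enables, the following sketch builds a minimal metadata document for a hypothetical table `sites.csv`; the column names, datatypes and primary key are illustrative assumptions, not part of the standard itself.

```python
import json

# A minimal CSVW metadata annotation for a hypothetical file "sites.csv".
# The column names, datatypes and primary key below are invented examples;
# only the property names (@context, url, tableSchema, columns, datatype,
# primaryKey) come from the CSVW metadata vocabulary.
metadata = {
    "@context": "http://www.w3.org/ns/csvw",
    "url": "sites.csv",
    "tableSchema": {
        "columns": [
            {"name": "site_id", "titles": "Site ID", "datatype": "integer"},
            {"name": "site_name", "titles": "Site name", "datatype": "string"},
            {"name": "period", "titles": "Dating", "datatype": "string"},
        ],
        "primaryKey": "site_id",
    },
}

# A CSVW-aware processor would read this file (conventionally named
# "sites.csv-metadata.json") to parse and validate the CSV contents.
print(json.dumps(metadata, indent=2))
```

Such a metadata file lets a processor check, for instance, that every `site_id` value is a unique integer before the table is ingested.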
Document resulting from the work of the Working Group “Archaeology-Thesauri” of the Commission “Archaeology and Information Systems” of the Association of State Archaeologists.
Rainer Göldner (Landesamt für Archäologie Sachsen) proposes a data structure that maps chronological terms to time concepts (timeline). The dating scheme allows a supra-regional description of chronological terms, taking into account hierarchies, time spans and chronological sequences. This means that different chronology systems can be recorded in parallel and used for comprehensive analyses. The schema is explained and critically discussed on the basis of an application example.
In appendices, the data schema is applied to the “Timeline” of the Working Group Thesaurus, which lists term systems or thesauri used locally in various German federal states for archaeological dating.
A summary and translation of the CARE Principles of the Global Indigenous Data Alliance (GIDA) into German by the joint project EcoDM, funded by the Federal Ministry of Education and Research (BMBF).
With this White Paper, which gathers contributions from more than 25 experts in 3D imaging, modelling and processing, as well as professionals concerned with the interoperability and sustainability of research data, the PARTHENOS project aims at laying the foundations of a comprehensive environment centred on researchers’ practices concerning 3D digital objects. The topics addressed in the document are meant to help ensure the development of standardized good practices relating to the production, handling, long-term conservation and reuse of 3D objects. Therefore, even if the focus is put on technical questions (formats, processing, and annotation), the White Paper also identifies the need to clarify the legal status of 3D objects in order to facilitate their reuse(s) in non-research contexts, in particular in museums.
The Digital Preservation Handbook, first compiled by Neil Beagrie and Maggie Jones in 2001, is maintained and updated by the Digital Preservation Coalition. This full revision (the 2nd Edition) has expanded and updated content to cover over 30 major sections.
The 2nd edition was compiled with input from 45 practitioners and experts in digital preservation under the direction of Neil Beagrie as managing editor and William Kilbride as chair of the Management and Advisory Boards.
The Handbook provides an internationally authoritative and practical guide to the subject of managing digital resources over time and the issues in sustaining access to them. It will be of interest to all those involved in the creation and management of digital materials.
During the 2020 Spring semester, a course on digital archaeology and data reuse was taught at the University of Wisconsin-Milwaukee in the Department of Anthropology. This course provided students with a space to develop their digital literacy in archaeological contexts. We used a combination of lectures, discussions, and computer work to engage students in ways to create, manage, combine, and reuse archaeological data, while also confronting emerging issues in the field related to the increased use of digital technologies: data accuracy, data ethics, ownership, open data, public engagement, ‘big data.’ There were no expectations of a coding or data analysis background; this course was about exploration and experimentation. Students were not expected to emerge from this course with a complete knowledge of how to perform complex analyses on digital datasets. Rather, this course instilled an underlying framework for how to think about archaeology and the data that archaeologists produce, and fostered an understanding of the principles that are beginning to guide archaeological research in the 21st century. As social science disciplines increasingly move towards reliance on digital datasets for recording, research, accessibility, and archiving, it is now incumbent upon developing scholars to be familiar with the ways in which digital data can be used, and how they are impacting research practices.

This GitHub repo contains the syllabus for this course, which you should feel free to use, modify, or pick and choose from. In the Class Activities section in this README you will find links to other repos that contain class activities that you can also use. Please note that this course was designed for undergraduate and graduate archaeology students with NO experience in these digital tools, so the activities are pretty basic.
EDTF defines features to be supported in a date/time string, features considered useful for a wide variety of applications.
The Extended Date/Time Format (EDTF) was created by the Library of Congress with the participation and support of the bibliographic community as well as communities with related interests.
Date and time formats are specified in ISO 8601, the International Standard for the representation of dates and times. ISO 8601:2004 provided basic date and time formats; these were not sufficiently expressive to support various semantic qualifiers and concepts that many applications find useful. For example, although it could express the concept “the year 1984”, it could not express “approximately the year 1984”, or “we think the year is 1984 but we’re not certain”. These as well as various other concepts had therefore often been represented using ad hoc conventions; EDTF provides a standard syntax for their representation.
Further, ISO 8601 is a complex specification describing a large number of date/time formats, in many cases providing multiple options for a given format. Thus a second aim of EDTF is to restrict the supported formats to a smaller set.
EDTF functionality has now been integrated into ISO 8601-2:2019 (Extensions), part of the latest revision of ISO 8601, published in March 2019.
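The qualifiers mentioned above can be made concrete with a few example strings. The sketch below lists some Level 0/1 EDTF dates (the qualifier characters `?`, `~` and `%` are defined by the EDTF specification) together with a deliberately simplified pattern check; it is not a full EDTF validator.

```python
import re

# Illustrative EDTF date strings. Per the EDTF specification:
#   ?  = uncertain, ~ = approximate, % = both uncertain and approximate.
examples = {
    "1984":       "the year 1984",
    "1984~":      "approximately the year 1984",
    "1984?":      "the year is thought to be 1984, but it is not certain",
    "1984%":      "approximately and uncertainly the year 1984",
    "1984-06":    "June 1984",
    "1984-06-11": "11 June 1984",
}

# A deliberately simplified check covering only the year/month/day subset
# shown above, with at most one trailing qualifier.
YEAR_PATTERN = re.compile(r"^\d{4}(-\d{2}){0,2}[?~%]?$")

for edtf_string, meaning in examples.items():
    assert YEAR_PATTERN.match(edtf_string), edtf_string
    print(f"{edtf_string:12} -> {meaning}")
```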
Local Contexts is a global initiative that supports Indigenous communities with tools that can reassert cultural authority in heritage collections and data. The Notices are tools for institutions and researchers to identify Indigenous collections and data and recognize Indigenous rights and interests. The Notices were developed to create pathways for partnership, collaboration, and support of Indigenous cultural authority. Notices can be applied to websites, publications, datasets, museum exhibitions, items in a collection, genetic samples, and more. On this webpage, Local Contexts describes the purpose, the different types, and the correct use of these Notices.
This article deals with the certification of repositories and the questions that data centres considering certification of their institution have to address. The article centres on an experience report on the attainment of the CoreTrustSeal (CTS) by the Research Data Centre at the Institute for Quality Development in Education (FDZ at the IQB). It describes the motivation for the certification process, the criteria and subject areas covered by the CTS, and the practical implementation of the certification project. For each step, recommendations (lessons learned) are derived that could be helpful for other data centres wishing to obtain the CTS.
Preface
This guide aims to help UK Higher Education Institutions aid their researchers in making informed choices about what research data to keep. The content complements other DCC guides: How to Appraise & Select Research Data for Curation,1 and How to Develop Research Data Management Services.2 The guide will be relevant to researchers making decisions on a project-by-project basis, or formulating departmental guidelines. It assumes that decisions on particular datasets will normally be made by researchers with advice from the appropriate staff (e.g. academic liaison librarians) taking into account any institutional policy on Research Data Management (RDM) and guidance available within their own domain. As such, the guide should also be relevant to staff with responsibility for defining such policy in a Higher Education Institution, a Professional or Learned Society or similar disciplinary body.
The guide assumes that part way through their research the Principal Investigator, or other researcher responsible for data management, will want to choose what data to keep, informed by commitments already made to share or retain data (e.g. in a Data Management Plan). The unit of appraisal is a ‘data collection’ and this may include different files carrying different access permissions and/or licence conditions.
The text also assumes that the institution will provide the following capabilities:
No assumption is made about how either of the above capabilities will be provided; for example, they might be repository or managed storage services, distinct from or integrated with a publications repository or a CRIS (Current Research Information System). In either case the capability could be provided in-house, or outsourced e.g. through Janet Cloud Services.3 The guide may be adapted to reflect local services and guidance on selecting external repositories for data deposition.4 DCC can provide help with this customisation to institutions’ needs and visual design.5
Whyte A. and Wilson A. (2010) How to Appraise & Select Research Data for Curation. DCC How-to Guides. Edinburgh: Digital Curation Centre. Available online: www.dcc.ac.uk/resources/how-guides ↩
Jones, S., Pryor, G. & Whyte, A. (2013). How to Develop Research Data Management Services - a guide for HEIs. DCC How-to Guides. Edinburgh: Digital Curation Centre. Available online: www.dcc.ac.uk/resources/how-guides ↩
Details of Janet Cloud Services available at: www.ja.net/products-services/janet-cloud-services ↩
A DCC Checklist for Evaluating Data Repository Services will be available 2014 [Version 1.1 of the checklist is now available] ↩
Further information at www.dcc.ac.uk/tailored-support, or contact info@dcc.ac.uk ↩
XML template for the Research Data Management Organiser developed by the FODAKO project in accordance with the DFG checklist and the guidelines for handling research data of the DFG Review Board 101 “Ancient Cultures”.
The Research Data Management Organiser, RDMO for short, is a tool for creating data management plans. It was developed as part of a DFG-funded project and is now maintained by an RDMO working group in which anyone can participate.
Many research institutions and universities maintain productive or test instances of the RDMO software. An overview of the institutions and contact persons can be found on the project homepage in the “Cooperation network” section.
Please note: The same GitHub repository also contains a template from the University Library of FAU Erlangen-Nuremberg for the DFG Review Board 101 “Ancient Cultures”.
Geonames.org is a geographic database that contains an extensive collection of information on over eleven million place names - including historical ones - from around the world. These toponyms identify geographical locations such as countries, cities, streets, rivers, lakes, mountains and islands and serve as standardised data thanks to the persistent, unique identifiers. GeoNames.org thus contributes to the standardisation of place names, helps to avoid misunderstandings and uncertainties and enables precise communication about geographical locations.
GeoNames is licensed under a Creative Commons licence and is accessible free of charge via a range of web services and a daily database export. The database contains over 25 million geographic names and consists of over 12 million unique entries, including 4.8 million populated places and 16 million alternative names. All features are categorised into one of nine feature classes and further categorised into one of 645 feature codes.
GeoNames integrates geographic data such as place names in different languages, elevations, populations and other data from various sources. All latitude/longitude coordinates are in WGS84 (World Geodetic System 1984). Users can manually edit, correct and add new names via a user-friendly wiki interface.
GeoNames maintains its own ontology (documentation) to add geographical semantic information to the World Wide Web. All of the more than 11 million GeoNames toponyms have a unique URL with a corresponding RDF web service. Other services describe the relation between toponyms.
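As a small illustration of the per-toponym URLs mentioned above, the sketch below constructs the semantic URI and the RDF document location for one GeoNames ID. The specific ID (2950159, used here for Berlin) and the exact URL pattern under `sws.geonames.org` are stated as assumptions based on the GeoNames documentation.

```python
# Each GeoNames toponym is identified by a numeric ID. Its stable semantic
# URI and the corresponding RDF description are served under
# sws.geonames.org; the ID below is purely an example.
GEONAMES_ID = 2950159  # assumed example: Berlin

semantic_uri = f"https://sws.geonames.org/{GEONAMES_ID}/"
rdf_document = f"https://sws.geonames.org/{GEONAMES_ID}/about.rdf"

print(semantic_uri)   # the identifier to use in linked data
print(rdf_document)   # the machine-readable RDF description
```

Using the numeric URI (rather than a place name string) as an identifier is what makes GeoNames references unambiguous across languages and spelling variants.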
The OGC GeoSPARQL standard supports representing and querying geospatial data on the Semantic Web. GeoSPARQL defines a vocabulary for representing geospatial data in RDF, and it defines an extension to the SPARQL query language for processing geospatial data. In addition, GeoSPARQL is designed to accommodate systems based on qualitative spatial reasoning and systems based on quantitative spatial computations.
The GeoSPARQL standard is actively maintained. To follow its progress, visit the public Github repository located here.
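To give a feel for the query extension described above, the sketch below assembles a GeoSPARQL query that selects features whose geometry lies within a WKT polygon. The prefixes and the `geof:sfWithin` filter function follow the GeoSPARQL vocabulary; the dataset shape and the polygon coordinates are illustrative assumptions.

```python
# A sketch of a GeoSPARQL query: find all features whose geometry lies
# within a given polygon (coordinates here are illustrative, roughly the
# Berlin area in lon/lat order).
query = """
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>

SELECT ?feature WHERE {
  ?feature geo:hasGeometry ?geom .
  ?geom geo:asWKT ?wkt .
  FILTER (geof:sfWithin(?wkt,
    "POLYGON((13.0 52.3, 13.8 52.3, 13.8 52.7, 13.0 52.7, 13.0 52.3))"^^geo:wktLiteral))
}
"""

# The string would be sent to a GeoSPARQL-capable SPARQL endpoint;
# here we only print it.
print(query)
```

`sfWithin` is one of the simple-features topological relations GeoSPARQL defines (alongside `sfContains`, `sfIntersects`, etc.), which is what allows qualitative spatial reasoning on top of plain RDF.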
This guideline is dedicated to the topic of “digital long-term archiving” and is aimed at researchers in the NFDI4Culture community and at infrastructure facilities. It conveys the basics of long-term archiving and presents the offerings of NFDI4Culture. In addition, the steps of moving materials into a digital long-term archive are explained using examples, and the OAIS model is presented. Another section is dedicated to the different strategies of a digital long-term archive. In addition to clarifications of terms, the extensive appendix offers a detailed list of references. (Source)
In Chapter 2.2, “Trustworthiness of digital long-term archiving”, the authors discuss the TRUST principles in more detail. In addition to the PDF publication stored here, a web version is also available at this link.
This resource is only available in German.
Resource available in German only
Due to increasing interdisciplinarity, the promotion of the open science concept and, in particular, the requirements that third-party funding bodies place on the projects they fund, professional, science-oriented advice on handling research data has become necessary in order to successfully acquire third-party funding and conduct research in line with good scientific practice. Many research institutions have recognized this, which is why research data management (RDM) consulting services have increased significantly in the scientific environment in recent years. The development and establishment of such services depends on various factors, such as the resourcing of RDM advisory services, the institutional composition of the advisory team, the (subject-specific) competencies of the institution and the RDM needs landscape.
To support the planning, implementation and professionalization of RDM consultation services, a group of authors formed in 2023 as part of a GO UNITE! workshop. The group developed concepts for conducting and recording RDM consultation meetings and produced this handout on consultation meetings in research data management.
Source (translated): Helling, P., Lemaire, M., Asef, E., Assmann, C., Christ, A., Engelhardt, C., Herwig, A., Kellendonk, S., Mertzen, D., Thaut, A., Wiljes, C., & Zollitsch, L. (2024). Handreichung für die Beratung im Forschungsdatenmanagement. Zenodo. https://doi.org/10.5281/zenodo.13684373, CC BY 4.0
This handbook was written and edited by a group of about 40 collaborators in a series of six book sprints that took place between 1 and 10 June 2021. It aims to support higher education institutions with the practical implementation of content relating to the FAIR principles in their curricula, while also aiding teaching by providing practical material, such as competence profiles, learning outcomes, lesson plans, and supporting information. It incorporates community feedback received during the public consultation which ran from 27 July to 12 September 2021.
In contrast to static two-dimensional images, three-dimensional representations of objects can be viewed, scaled and rotated from any direction. A point in a 3D model is described by its position on the x-, y- and z-axis of a Cartesian coordinate system, where the z-axis in this context usually indicates depth, more rarely height.
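The rotation mentioned above is just a coordinate transformation on such (x, y, z) points. The following minimal sketch (plain Python, no 3D library) rotates a point about the vertical y-axis; the convention that z is the depth axis follows the description above.

```python
import math

def rotate_y(point, theta):
    """Rotate a point (x, y, z) about the y-axis by angle theta (radians).

    Uses the standard rotation matrix about y; the y coordinate is
    unchanged, while x and z mix according to cos/sin of the angle.
    """
    x, y, z = point
    return (x * math.cos(theta) + z * math.sin(theta),
            y,
            -x * math.sin(theta) + z * math.cos(theta))

# Rotating the point (1, 0, 0) by 90 degrees moves it from the x-axis
# onto the (negative) z-axis, i.e. from "to the right" to "toward the viewer"
# or "away", depending on the handedness of the chosen coordinate system.
x, y, z = rotate_y((1.0, 0.0, 0.0), math.pi / 2)
```

Viewers for 3D models apply exactly this kind of transformation, composed for all three axes, to every vertex when the user rotates the object.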
Virtual reality refers to digital three-dimensional worlds that can be interacted with in real time.
3D content can be created in different ways: by manual modeling, such as reconstructions of buildings; by recording, such as a 3D scan of objects; or by automated calculation from photos, such as photogrammetry or structure from motion. Which documentation details are required beyond those given here depends on the method of creation. Additional information can be found in the respective sections in the chapter on research methods, of which the sections on building research, geodesy, geodata analysis and material recording are of particular interest. Furthermore, the results of the 3D ICONS project provide extensive information on the documentation of 3D recording methods and subsequent processing. (Source)
This resource is unfortunately only available in German.
Recommendations for Geodesy
Recommendations for Georeferencing
When dealing with digital data, different requirements and conditions apply to each phase of a research project. Since research data is subject to a life cycle, decisions and work steps taken in a particular phase also have an impact on the other phases of the data cycle. When designing and planning a new research project, it is important to consider what information already exists digitally, what types of new files need to be created, what information technologies should be used and how the management of research data will be organized. (Source)
This resource is unfortunately only available in German.
Chapter overview
2.1 Data management
2.1.1 Overview of the tasks in the project phases
2.1.2 Data management plan
2.1.3 Further information on data management
2.2 Documentation
2.2.1 Documentation with metadata
2.2.2 Structuring metadata
2.2.3 Controlled vocabularies, thesauri and standard data
2.2.4 Storing metadata
2.2.5 Metadata in the application
2.2.6 Excavation documentation
2.2.7 Further information
2.3 File management
2.3.1 File storage
2.3.2 Recommendations for a folder structure
2.3.3 File naming
2.3.4 Version control
2.4 File storage and backup
2.4.1 Short-term storage
2.4.2 Medium-term backup
IANUS IT-Recommendations for Websites
ICC color management pursues the goal of creating, promoting and encouraging the standardization of an open, vendor-neutral, cross-platform color management system architecture and components. While the current architecture works well in many areas, new potential applications are emerging, and it is believed that tomorrow’s color communication will require a more flexible and extensible system. The ICC has developed a new specification, iccMAX, that addresses many of these new requirements. iccMAX is also published by ISO as ISO 20677.
Please note: A core part of iccMAX are the Interoperability Conformance Specifications (ICS). They specify the requirements for an iccMAX profile sub-class for a particular use case, such as biomedical imaging, n-colour printing, etc. An ICS defines a sub-set of the iccMAX specification but does not add any elements beyond those already defined in iccMAX. An introduction to ICS documents can be found in ICC White Paper 54.
It’s been ten years since open data first broke onto the global stage. Over the past decade, thousands of programmes and projects around the world have worked to open data and use it to address social and economic challenges. Meanwhile, issues related to data rights and privacy have moved to the centre of public and political discourse. As the open data movement enters a new phase in its evolution, shifting to target real-world problems and embed open data thinking into other existing or emerging communities of practice, big questions still remain. One of these discourses is on Indigenous Data Sovereignty (IDS), which refers to the right of Indigenous people to control data from and about their communities and lands.
This chapter of the book “The State of Open Data: Histories and Horizons” deals with the topic of IDS. It has emerged as an important topic over the last three years, raising fundamental questions about assumptions of ownership, representation, and control in open data communities. Ideas from IDS provide a challenge to dominant discourses in open data, questioning current approaches to data ownership, licensing, and use in ways that resonate beyond Indigenous contexts, drawing attention to the power and post-colonial dynamics within many data agendas.
Recommendations for pictures - raster graphics
Recommendations for Audio-data
Recommendations for pictures - vector graphics and CAD-Data
Recommendations for Databases
Recommendations for Geodata
Recommendations for PDF-Files
IANUS is a research data center for archaeology and ancient studies in Germany based at the German Archaeological Institute in Berlin. IANUS was funded by the DFG as a joint project from 2011 to 2017.
IANUS IT recommendations provide background information and hands-on tips by primarily addressing the exchange, long-term archiving and reusability of digital research data.
This source is currently only available in German.
IANUS is a research data center for archaeology and ancient studies in Germany based at the German Archaeological Institute in Berlin. IANUS was funded by the DFG as a joint project from 2011 to 2017.
IANUS IT recommendations provide background information and hands-on tips by primarily addressing the exchange, long-term archiving and reusability of digital research data.
This source is currently only available in German.
IANUS is a research data center for archaeology and ancient studies in Germany based at the German Archaeological Institute in Berlin. IANUS was funded by the DFG as a joint project from 2011 to 2017.
IANUS IT recommendations provide background information and hands-on tips by primarily addressing the exchange, long-term archiving and reusability of digital research data.
This source is only available in German.
This portal shows a list of the Coordinate Reference Systems included in PROJ. You can filter by type, authority, name and location. Selecting the area of use of any CRS displays it as a rectangle in the map.
The explorer is kept up to date with every version of PROJ (released every three months). Alternative directories like sr.org and epsg.io are outdated.
The explorer includes all the CRSs from PROJ (EPSG, ESRI, IGNF, etc.), represented in WKT2; cf. the announcement by the author on Mastodon.
Author: Javier Jimenez Shaw
The management of digital research data is an important new field of research that has emerged in the process of digitalization. For sustainable research data management (RDM), scientists need not only knowledge and skills in their research area, but also additional competencies in dealing with digital data. Ideally, this knowledge should already be taught during studies. In addition, there is an increasing demand for research support staff, e.g. in the form of data stewards, which can only be met through appropriate training and continuing education measures.
This learning objective matrix summarizes relevant teaching contents and associated learning objectives for the qualification levels Bachelor, Master, PhD and Data Steward from a number of national and international projects and training concepts in the field of RDM in a consistent form and offers subsequent users an orientation for identifying relevant content aspects as well as a working basis, e.g. for an extended subject-specific or event-specific design.
LIDO is an XML schema intended for delivering metadata, for use in a variety of online services, from an organization’s collections database to portals of aggregated resources, as well as for exposing, sharing and connecting data on the web. Its strength lies in its ability to support the typical range of descriptive information about objects of material culture. It can be used for all kinds of objects, e.g. art, cultural, technological and natural science objects, and supports multilingual portal environments.
The LIDO schema is the result of a substantial redesign and enhancement of the CDWA Lite and museumdat schemas based on recommendations of the CDWA Lite/museumdat Working Group, community feedback and further CIDOC-CRM analysis. It mainly builds on CDWA and includes additional concepts to meet SPECTRUM requirements.
Linked Art is a metadata application profile that is mainly based on the CIDOC Conceptual Reference Model. It adds various implementation-related decisions and patterns that are not suitable for the more conceptual level of the ontology. This ranges from RDF and Linked Open Data as a carrier model to the JSON-LD format for serialisations, recommendations on specific modelling patterns and minor extensions based on the use cases of the participating institutions. The scope of Linked Art is also narrower than that of the full CRM, as the focus is on art museum use cases and not on overall knowledge management in cultural heritage.
In order to increase the quality of the data set and the potential for later use, research data should be published in a form that is easy to find and understand. To achieve this, the data must be enriched with additional information, known as metadata. Well-designed and documented metadata therefore plays a key role in finding, understanding and reusing research data.
This service provides basic knowledge about the use of metadata and a variety of helpful links and references on the topic of research data management.
This resource is currently only available in German.
The Minimum Record Recommendation is an online manual that specifies the most important data fields for the online publication of object information from museums and collections. “Minimum Record” stands for the smallest possible intersection of important data fields across disciplines and museum types. This list of data fields can be the basis for more in-depth cataloguing. The Minimum Record Recommendation is an application profile of the LIDO data schema. The recommendation is also compatible with the Europeana Data Model. The guideline focuses on controlled vocabularies and compliance with the FAIR and CARE principles.
It was developed by the AG Minimaldatensatz.
This guide aims to make it easier for researchers working on social, behavioural and economic sciences projects to get started in research data management (RDM). It focuses on individual research projects or small-scale, temporary project associations that intend to build up a data store for their long-term use and/or to make it available, but which have not yet resorted to using institutional infrastructures. This brochure gives an overview, limiting its focus to data in the social, behavioural and economic sciences. It is application-oriented and does not claim to be comprehensive.
Guidelines for archaeological excavations and prospections in Baden-Württemberg. These also contain a large number of important and required lists and forms as well as thesauri.
Please note: NFDI4Objects aims to provide evidence of the current guidelines of the federal states in the incubator. However, we cannot guarantee that the currently valid and legally effective regulations are always linked here. In any case, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
Guidelines for archaeological excavations and prospections in Bavaria, as well as a large number of lists and forms.
Please note: NFDI4Objects aims to link to the current guidelines of the federal states in the incubator. However, we cannot guarantee that the regulations linked here are always the currently valid and legally effective versions. If in doubt, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
Guidelines for archaeological excavations and prospections in Berlin, as well as a form for reporting finds and information on exploration.
Please note: NFDI4Objects aims to link to the current guidelines of the federal states in the incubator. However, we cannot guarantee that the regulations linked here are always the currently valid and legally effective versions. If in doubt, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
Guidelines for archaeological excavations and prospections in Brandenburg, as well as “Appendices to the documentation guidelines” (lists and protocols) and thesauri in the GIS template folder.
Please note: NFDI4Objects aims to link to the current guidelines of the federal states in the incubator. However, we cannot guarantee that the regulations linked here are always the currently valid and legally effective versions. If in doubt, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
Guidelines for archaeological excavations and prospections in Bremen.
Please note: NFDI4Objects aims to link to the current guidelines of the federal states in the incubator. However, we cannot guarantee that the regulations linked here are always the currently valid and legally effective versions. If in doubt, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
Guidelines for archaeological excavations and prospections in Hesse, plus a form for archaeobotanical samples and guidelines for paleontology and the Messel Pit.
Please note: NFDI4Objects aims to link to the current guidelines of the federal states in the incubator. However, we cannot guarantee that the regulations linked here are always the currently valid and legally effective versions. If in doubt, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
Guidelines for archaeological excavations and prospections in Mecklenburg-Vorpommern, as well as a large number of important and required lists, forms and thesauri.
Please note: NFDI4Objects aims to link to the current guidelines of the federal states in the incubator. However, we cannot guarantee that the regulations linked here are always the currently valid and legally effective versions. If in doubt, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
Guidelines for archaeological excavations and prospections in Lower Saxony, plus an attachment to the guidelines as well as “Guidelines for Underwater Cultural Heritage”.
Please note: NFDI4Objects aims to link to the current guidelines of the federal states in the incubator. However, we cannot guarantee that the regulations linked here are always the currently valid and legally effective versions. If in doubt, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
Guidelines for archaeological excavations and prospections in North Rhine-Westphalia (LVR) with a file for finds lists, instructions for submitting SfM photos, thesauri, a program for checking the documentation and a form for recording the most important data of a measure.
Please note: NFDI4Objects aims to link to the current guidelines of the federal states in the incubator. However, we cannot guarantee that the regulations linked here are always the currently valid and legally effective versions. If in doubt, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
Guidelines for archaeological excavations and prospections in North Rhine-Westphalia (LWL), as well as “Fundzettel” and “Fundbegleitzettel” forms and various brochures; thesauri are only available via the Adiuvabit information system.
Please note: NFDI4Objects aims to link to the current guidelines of the federal states in the incubator. However, we cannot guarantee that the regulations linked here are always the currently valid and legally effective versions. If in doubt, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
Guidelines for archaeological excavations and prospections in the city of Hamburg and the district of Harburg.
Please note: NFDI4Objects aims to link to the current guidelines of the federal states in the incubator. However, we cannot guarantee that the regulations linked here are always the currently valid and legally effective versions. If in doubt, please contact the relevant authorities.
We would be very grateful for information on updated regulations. Please contact the NFDI4Objects Helpdesk.
This source is only available in German.
The document contains a metadata schema for training materials on the topic of research data management. The schema was created by the UAG Training/Continuing Education within the DINI/nestor Research Data Working Group and implemented in the collection of RDM training materials at https://rs.cms.hu-berlin.de/uag_fdm/.
(Source)
This source is currently only available in German.
Based on the Lernzielmatrix zum Themenbereich Forschungsdatenmanagement (FDM) für die Zielgruppen Studierende, PhDs und Data Stewards, a training course was developed that offers these target groups an opportunity to familiarize themselves with the topic of data formats in six self-study steps. The knowledge acquired is tested after each theory unit. The content focuses on the various aspects of data formats: the course starts with the difference between data format and file format before describing various requirements for FAIR data formats and an archetype perspective from the engineering sciences. The concluding chapter provides best practices and illustrative examples.
This course is currently only available in German.
The aim of the study is to map the parameters, formats, standards, benchmarks, methodologies and guidelines relating to 3D digitisation of tangible cultural heritage.
The overall objective is to further the quality of 3D digitisation projects by enabling cultural heritage professionals, institutions, content-developers, stakeholders and academics to define and produce high-quality digitisation standards for tangible cultural heritage. This unique study identifies key parameters of the digitisation process, estimates the relative complexity and how it is linked to technology, its impact on quality and its various factors. It also identifies standards and formats used for 3D digitisation, including data types, data formats and metadata schemas for 3D structures. Finally, the study forecasts the potential impacts of future technological advances on 3D digitisation.
This study was commissioned by the European Commission to help advance 3D digitisation across Europe and thereby support the objectives of the Recommendation on a common European data space for cultural heritage (C(2021) 7953 final), adopted on 10 November 2021.
The Recommendation encourages Member States to set up digital strategies for cultural heritage that set clear digitisation and digital preservation goals, aiming at higher quality through the use of advanced technologies, notably 3D.
The article provides an overview of the historical discussions and developments concerning the ethical handling of Indigenous data and introduces the CARE Principles (Collective Benefit, Authority to Control, Responsibility, and Ethics), which focus on the concerns of Indigenous peoples regarding the sharing, curation and reuse of their data. In addition, the authors highlight the current relationship between archaeological research in Canada and the CARE Principles and present examples of the integration of research processes, data practices and the CARE Principles. In the appendix, the authors outline a possible integration of the CARE Principles into the Society for American Archaeology (SAA) guidance on archaeological data management plans.
In this first formal publication of the CARE Principles for Indigenous Data Governance, both the principles themselves and their predecessors are explained. In addition to the accompanying concepts and supporting initiatives, the connections and areas of tension between the “people-centric” approach of the CARE Principles and the “data-centric” approach of principles such as FAIR and the Open Data Charter are explained.
Concerns about secondary use of data and limited opportunities for benefit-sharing have focused attention on the tension that Indigenous communities feel between (1) protecting Indigenous rights and interests in Indigenous data (including traditional knowledge) and (2) supporting open data, machine learning, broad data sharing, and big data initiatives. The International Indigenous Data Sovereignty Interest Group (within the Research Data Alliance) is a network of nation-state based Indigenous data sovereignty networks and individuals that developed the ‘CARE Principles for Indigenous Data Governance’ (Collective Benefit, Authority to Control, Responsibility, and Ethics) in consultation with Indigenous Peoples, scholars, non-profit organizations, and governments.
The notion that data should be Findable, Accessible, Interoperable and Reusable, according to the FAIR Principles, has become a global norm for good data stewardship and a prerequisite for reproducibility. Nowadays, FAIR guides data policy actions and professional practices in the public and private sectors. Despite such global endorsements, however, the FAIR Principles are aspirational, remaining elusive at best, and intimidating at worst. To address the lack of practical guidance, and help with capability gaps, we developed the FAIR Cookbook, an open, online resource of hands-on recipes for “FAIR doers” in the Life Sciences. Created by researchers and data management professionals in academia, (bio)pharmaceutical companies and information service industries, the FAIR Cookbook covers the key steps in a FAIRification journey, the levels and indicators of FAIRness, the maturity model, the technologies, tools and standards available, as well as the skills required, and the challenges to achieve and improve data FAIRness. Part of the ELIXIR ecosystem, and recommended by funders, the FAIR Cookbook is open to contributions of new recipes.
The cookbook introduced in the paper can be found here.
The FAIR Data Maturity Model can act as a tool that can be used by various stakeholders, including researchers, data stewards, policy makers and funding agencies, to gain insight into the current FAIRness of data as well as into the aspects that can be improved to increase the potential for reuse of research data. Through increased efficiency and effectiveness, it helps research activities to solve societal challenges and to support evidence-based decisions.
This Comment is the first formal publication of the FAIR Principles and includes the rationale behind them and some exemplar implementations in the community. A diverse set of stakeholders — representing academia, industry, funding agencies, and scholarly publishers — came together to design and jointly endorse a concise and measurable set of principles that is referred to as the FAIR Data Principles.
This article describes these four foundational principles — Findability, Accessibility, Interoperability, and Reusability — that serve to guide data producers and publishers.
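As a loose illustration of the four principles, the sketch below evaluates a dataset description against one indicator per FAIR dimension. The chosen indicators (persistent identifier, access URL, standard format, licence) are a deliberate simplification for illustration, not the official FAIR metrics or any published indicator set.

```python
# One simplified indicator per FAIR dimension; real FAIR
# assessments use far richer indicator sets and maturity levels.
STANDARD_FORMATS = {"text/csv", "application/json", "application/xml"}

def fair_check(meta: dict) -> dict:
    """Map each FAIR dimension to a single pass/fail indicator."""
    return {
        "findable": bool(meta.get("identifier")),    # e.g. a DOI
        "accessible": bool(meta.get("access_url")),  # retrievable via a standard protocol
        "interoperable": meta.get("format") in STANDARD_FORMATS,
        "reusable": bool(meta.get("license")),       # clear usage licence
    }

meta = {
    "identifier": "doi:10.1234/demo",          # hypothetical example values
    "access_url": "https://example.org/data",
    "format": "text/csv",
    "license": "CC0-1.0",
}
print(fair_check(meta))  # all four indicators True for this example
```

Such a check only shows the shape of a FAIR assessment; the FAIR Data Maturity Model mentioned above defines graded maturity levels per indicator rather than a binary pass/fail.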
Following a year-long public discussion and building on existing community consensus, several stakeholders, representing various segments of the digital repository community, have collaboratively developed and endorsed a set of guiding principles to demonstrate digital repository trustworthiness. Transparency, Responsibility, User focus, Sustainability and Technology: the TRUST Principles provide a common framework to facilitate discussion and implementation of best practice in digital preservation by all stakeholders.
The “Archiving” working group of the Association of State Archaeologists in the Federal Republic of Germany deals with generally available methodological and technical approaches to archiving digital archaeological data and provides recommendations, suggestions and tips for setting up digital archives. The current experience of the “Archiving” working group is summarised in topic sheets on the archiving of digital data (all in German):
Local Contexts is a global initiative that supports Indigenous communities with tools that can reassert cultural authority in heritage collections and data. The Traditional Knowledge (TK) and Biocultural (BC) Labels are tools for Indigenous communities and local organizations. Developed through sustained partnership and testing within Indigenous communities across multiple countries, the Labels allow communities to express local and specific conditions for sharing and engaging in future research and relationships in ways that are consistent with already existing community rules, governance, and protocols for using, sharing, and circulating knowledge and data. On this webpage Local Contexts describes the purpose, the different groups and the right use of these labels.
This learning module on The CARE Principles is part of version 5.0 of the Train-the-Trainer Programme on research data management of the Sub-Working Group Education and Training of the DINI/nestor Working Group. It serves as a conceptual basis for preparing and conducting train-the-trainer workshops on the topic. In addition to the concept itself, workshop materials such as slides, worksheets and templates, an example of a teaching script and an explanation of the methods used are also published. In terms of content, the learning module includes an introduction to and explanations of the CARE Principles, which were developed as a guideline for handling Indigenous data. The relationship between the CARE and FAIR Principles is also discussed, and it is demonstrated how the CARE Principles can be applied in research projects.
As part of the BMBF project FDMentor, a German-language train-the-trainer program on the topic of research data management (RDM) was created, which will be continuously supplemented and updated after the end of the project by members of the DINI/nestor Research Data Working Group. The topics covered include both the content-related aspects of research data management as well as units on didactic principles and the development of teaching and workshop concepts. The concept also contains a collection of didactic methods.
For the now published fifth version of the train-the-trainer concept, the content has been transferred to LaTeX for more sustainable reusability and a new design has been added. For each unit, suggestions for implementation options and the use of didactic methods for both face-to-face and online events have been integrated; a color scheme makes these easier to follow. Interactive info boxes summarize the materials associated with the individual units (presentation slides, work materials). The train-the-trainer concept is available for download as a pdf. The accompanying materials folder contains the presentation slides for day 1 and day 2, both complete and split into individual units, as pptx and pdf, as well as work materials such as worksheets and handouts for editing in docx and for direct printing as pdf.
The teaching scripts for the individual units have been updated and interactive elements have been added. A teaching script is available as a pdf for each unit, which contains all the essential information. In addition, an overall teaching script for all units (face-to-face and online workshop) is provided as an editable xlsx, which can be filtered according to specific units and formats.
All units in version 5.0 have been checked to ensure they are up to date, and the content of individual units has been revised. The learning objectives listed at the beginning of each unit have been updated in line with the learning objectives matrix for the FDM subject area (DOI: 10.5281/zenodo.8010617). In addition to this version, extension modules on the topics of CARE principles (DOI: 10.5281/zenodo.10197070), FAIR principles (DOI: 10.5281/zenodo.10197079), electronic lab books (DOI: 10.5281/zenodo.10197096), re-use of research data (DOI: 10.5281/zenodo.10160865) and software management plans (DOI: 10.5281/zenodo.10197107) will be published. Further thematic modules are planned.
This source is currently only available in German.
(Source)
XML template for the Research Data Management Organiser, developed as part of the eHumanities-interdisziplinär project. The template is based on the DFG questionnaire of the FoDaKo project; however, its structure has been changed compared to the FoDaKo plans and supplemented to meet the requirements of the handout from Review Board 101 on handling research data. The plan follows the five sections of the guidelines:
The Research Data Management Organiser, RDMO for short, is a tool for creating data management plans. It was developed as part of a DFG-funded project and is now maintained by an RDMO working group in which anyone can participate.
Many research institutions and universities maintain productive or test instances of the RDMO software. An overview of the institutions and contact persons can be found on the project homepage in the section “Kooperationsnetzwerk”.
The paper presents eleven strategies to make training in reproducible research and open science practices the norm in research institutions. The strategies, which emerged from a virtual brainstorming event in collaboration with the German Network for Reproducibility, focus on three areas:
For each strategy, the paper gives a brief overview, tips for implementation and links to resources.