Digitizing Heritage: Exploring the Transformation of Culture to Data

[from India in Transition by the Center for the Advanced Study of India at the University of Pennsylvania, 1 September 2025]

by Krupa Rajangam & Deborah Sutton

“Oh that. We just took some undergraduate history students on board as interns. They provided the content and it was done.”

The co-founder of a digital heritage initiative promoting interactive user interfaces offered these opening remarks. Speaking at a Delhi-based museum, he had been asked about the information provided to users as they moved their hands across an interactive board, revealing images and narratives relating to the Indian freedom movement. His response made clear that the physical and digital components of such installations—for example, the 3D-modeling software and hardware, the scanning equipment and its resolution, and the user interface—were more carefully designed and calibrated than the content they presented.

Contemporary cultural heritage (CH) is rife with digital innovation. The COVID pandemic accelerated this transformation as archivists and curators worked to develop content that would reach remote, locked-down audiences. Within significant limits, digital platforms can democratize and facilitate access to materials previously inaccessible. Instead of being physically siloed, digitized material—as data components and not just content about culture—can be reproduced, combined, and circulated endlessly to achieve a reach previously considered impossible. Accessibility and malleability remain among the great boons of digital formats. But here, we consider the information economy of CH practice as it exists—and not its extraordinary and often hypothetical potential—in two overlapping realms of digitized CH: for-profit business enterprises and academic side-hustles related to more mainstream academic research.

In the former, questions of what is shared are often less significant than the appeal of the format. In the latter, innovation is often the product of short-term projects that languish, abandoned after completion, and rarely find audiences. Our research builds on our individual experiences and on the findings of a scoping exercise, conducted in 2021-22, that examined a number of India-based heritage projects. It suggests the need for more careful consideration of the implications of transforming CH materials into forms of data; the change affects everything from how we understand “originality” to the reliance on for-profit services to deliver heritage material to the public.

As digitized representations of CH, and access to such formats, become more widespread, are we, as CH practitioners and academics, giving enough thought to how digital technologies are reshaping the nature of CH and its audience? Beyond questions of wider reach, are we sufficiently acknowledging how these changes challenge a continued focus on originality and the notion of the academy as the primary controller of access to knowledge and of its validity, in both research and practice?

Digitizing for Dissemination

In 2019, one of us—Deborah Sutton—developed a software platform, Safarnama, comprising an app and authored experiences around Delhi’s CH. The project subsequently extended to Karachi. Generating “original” content, such as audio-visual clips and old photographs, to be hosted on the app platform was key to its attractiveness and usefulness, but permissions proved tricky. Some collaborators who were initially keen to contribute content quietly withdrew, likely due to the unfamiliar format and unknown reach. The app format also raised other questions. Would incorporating content from non-digital but published scholarship require authorial permission or only acknowledgement?

In 2020, Krupa Rajangam held a sponsored incubation at the NSRCEL, a business incubator located at the Indian Institute of Management-Bangalore, to develop a web interface that would host geo-located stories of marginalized histories, drawing on both historical facts and lived experiences. Corporate mentors remained skeptical of her ability to source “original” content—content that was both authenticated and validated—on an ongoing basis. They repeatedly advised her to focus on the format, user experience, and appeal for “mass markets” so her prototype would find audiences. Both projects raised questions about who would consume the content and what constitutes the public or audience.
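A minimal sketch of what such a geo-located story record might look like, in Python: the schema and all field names (StoryRecord, source_context, validated) are hypothetical illustrations of the idea of keeping provenance attached to content-as-data, and are not drawn from either project.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class StoryRecord:
    """One geo-located story. Hypothetical schema for illustration only."""
    title: str
    narrative: str            # the story text itself
    latitude: float           # coordinates anchoring the story to a place
    longitude: float
    contributor: str          # who supplied the content (oral history, archive, scholar)
    source_context: str       # provenance: where the material came from
    validated: bool = False   # has the content been authenticated?
    collected_on: date | None = None
    tags: list[str] = field(default_factory=list)

# A record that keeps its provenance attached, so the source is not
# stripped away when the story circulates as data.
story = StoryRecord(
    title="A weaver's memory of the old market",
    narrative="...",
    latitude=12.9716, longitude=77.5946,  # Bangalore, for illustration
    contributor="oral-history interview",
    source_context="field recording, 2020; consent on file",
    validated=True,
)
```

The point of the sketch is that authentication and source context can travel with the story, rather than being stripped away once the content circulates.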

In a scoping exercise undertaken for the Arts and Humanities Research Council (AHRC), UK, in 2021-22, we explored a number of India-based heritage projects funded since 2015 by the AHRC in partnership with the Newton Fund and the Indian Council of Historical Research (figure 1). We were particularly interested in the digital components, which all projects included, even if only a website.

Our exploratory surveys firmly established a divergence, unsurprising in itself, in how both CH and digital technologies were interpreted. Some projects defined and treated CH as fixed, pre-existing material to be interpreted and presented to audiences through digital technologies. Others re-framed digital formats of CH as components of data, assembling, manipulating, and representing extant archival and other materials. The rest generated digitized CH, effectively altering its nature; typically, such projects dealt with more ephemeral or less conventional forms of CH.

Fundamental Transformations

Notions of originality remain central to art, architectural, and art historical training, and to CH practice. Digitization transforms the access and retrieval value of “original” material in physical archives—old maps and letters, much lauded in traditional “analog” scholarship—into use value as data. Once the end-user (audience) accesses this data (whether historical facts or stories), it becomes nothing more than bytes occupying valuable space, to be deleted once consumed rather than stored, making it easy to overlook or disregard the source and its context.

For example, in the Safarnama project, the app contained carefully collected and authenticated narratives of “partition memories” in Delhi and Karachi. However, the bite-sized media format meant that users would explore content only once, as snippets. This realization led the team to develop the software further and incorporate the ability to download content, which at least meant that users could collect, organize, and store (archive) the assembled media.

Digitization also takes away the materiality of the archive, making it more ephemeral. The non-digital materials through, and into, which we render CH can (in endless combinations and cycles) be lost, forgotten, sold, recovered, collected, displayed, and stored. Digital files obviously share these capacities, but they also depend on varied and dynamic software ecologies for their existence and for sustained end-user access. Digital files created within one software architecture can be incompatible with, and therefore rendered obsolete by, another. The ethos of software development is constant change.

In another paper, we examined questions of quantity, quality, and reusability of data related to the digitization of building-crafts knowledge alongside the CARE (Collective benefit, Authority to control, Responsibility, Ethics) and FAIR (Findable, Accessible, Interoperable, Reusable) principles of data management. The principles were proposed and adopted by an international consortium of scholars and industry; the former focus on the responsible collection, use, and dissemination of data, especially data relating to vulnerable people, and the latter on sustainable data management.

As an example, one AHRC project experimented with methods to capture detailed 3D images of heritage sites and structures in dynamic, crowded environments. The team used one set of methods to capture the interiors and another for the exteriors, hoping to merge the two into holistic imagery for audiences. This proved impossible at first due to issues of software compatibility. Once that was partially resolved, the new software could not handle the sheer volume of data captured—and it was unclear where, and for how long, such volumes of data would be stored.

New realms of intellectual property remain fuzzy. While the content on digital platforms is governed by licensing and proprietary legal frameworks, it is often hosted on open platforms, through web repositories such as GitHub. Prima facie, such openness appears to challenge the proprietorial nature of archives and other repositories as keepers of knowledge. It also, however, raises a host of questions about how to maintain a critical understanding of archives.

Digitization may, and should, transform access, but should it obliterate the regimes through which the materials were generated and organized, and which determine what is included or excluded? For example, a local coordinator of one project that engaged with artists commented that digital technologies are typically used to document technical skills as forms of intangible heritage and to develop artist encyclopedias, saying that “they are hardly used to interrogate the reality that many ‘traditional’ artists hail from marginalized castes.” Similarly, the local coordinator of another project, which engaged with communities living in and around a protected heritage site, commented on how digital technologies often end up being used to create a record of heritage structures without any reference to their day-to-day setting.

Any and all digital enterprise in CH, we argue, needs to integrate the ambition to use digital methods not just to present the material but also to counter and interrogate it, its creation, and its purpose. Digital platforms and web- and app-based software can now manipulate and re-situate information in unprecedented ways. The novelty of such formats can displace original, provocative, and timely considerations of the material. Often, we are so taken by the visual and structural attributes of these formats that we accept them at face value and lose sight of the tone and content of heritage as a curated message about the past and the present.

Alongside this, digital augmentations and iterations of CH, including storage, have significant financial and infrastructural implications. The creation and maintenance of digital platforms requires either developing “in-house” digital specialization or, more commonly, relying on private, for-profit platforms. Paying for external provision introduces complexities. Funders, including the AHRC, struggle to devise guidance or policy in relation to software licensing. A persistent challenge for projects, and for partnerships between academic and non-academic partners, is devising data and software strategies that subsist beyond the life of the funded research project. Often, the adverse effects of this paucity of longer-term planning around IP, sustainability, and data archiving fall disproportionately on the non-academic stakeholder.

While digitization foregrounds the potential and promise of complete openness and equity, much of this is lost in practice. Or digitization may merely mark the displacement of one set of ethics by another. There is a need for more careful consideration of the implications, complexities, and risks of taking CH materials out of boxes and off shelves and transforming them into data files, which are, in turn, dependent on digital platforms to provide end-user access. The question remains, however, whether heritage-related disciplines are adequately prepared, and willing, to confront such new ways of working, which have begun to dislodge some of the privileges extant in current forms of research and practice.

Krupa Rajangam is nearing the end of her tenure as a Fulbright Fellow at the Historic Preservation Department, Weitzman School of Design, University of Pennsylvania. Her permanent designation is Founder-Director, Saythu…linking people and heritage, a professional conservation collective based in Bangalore, India.

Deborah Sutton is a Professor in Modern South Asian History at Lancaster University.

First Clean Energy Cybersecurity Accelerator Participants Begin Technical Assessment

[From the National Renewable Energy Laboratory (NREL) News]

Program Selected Three Participants for Cohort 1

The first cohort of solution providers in the Clean Energy Cybersecurity Accelerator™ (CECA)—Blue Ridge Networks, Sierra Nevada Corporation, and Xage—recently began a technical assessment of their technologies, which offer strong authentication solutions for distributed energy resources.

The selected solution providers will take part in a six-month acceleration period, during which their solutions will be evaluated in the Advanced Research on Integrated Energy Systems (ARIES) cyber range.

Working with its partners, CECA identifies urgent security gaps and supports emerging technologies that address them, building security into new technologies at the earliest stage—when security is most effective and efficient. The initiative is managed by the U.S. Department of Energy’s (DOE’s) National Renewable Energy Laboratory (NREL) and sponsored by DOE’s Office of Cybersecurity, Energy Security, and Emergency Response (CESER) and utility industry partners, in collaboration with DOE’s Office of Energy Efficiency and Renewable Energy (EERE).

“We are thrilled to welcome and work with the first participants to the secure energy transformation,” said Jon White, director of NREL’s Cybersecurity Program Office. “These cyber-solution providers will work with NREL, using its world-class capabilities, to develop their ideas into real-world solutions. We are ready to build security into technologies at the early development stages when most effective and efficient.”

The selected innovators:

Blue Ridge Networks’ LinkGuard system “cloaks” critical information technology network operations from destructive and costly cyberattacks. The system overlays onto existing network infrastructure to secure network segments from external discovery or data exfiltration. Through a partnership with Schneider Electric, Blue Ridge Networks helped deploy a solution to protect supervisory control and data acquisition (SCADA) systems for the utility industry.

Sierra Nevada Corporation (SNC)’s Binary Armor® is used by the U.S. Department of Defense and utilities to protect critical assets, with the help of subject matter experts who deliver cyber solutions. SNC plans to integrate Binary Armor as a software solution into a communication gateway or other available edge processing, providing a scalable way to enforce safe operation in an unauthenticated ecosystem. SNC currently helps secure heating, ventilation, and air conditioning systems; programmable logic controllers; and wildfire detection, with remote monitoring for two different utilities.

Xage uses identity-based access control to protect users, machines, apps, and data, at the edge and in the cloud, enforcing zero-trust access to secure operations and data universally. In energy sector environments, Xage provides zero-trust remote access, has demonstrated proofs of concept, and has deployed local and remote access at various organizations.
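In the generic sense used here, zero-trust access control means no request is trusted by default: every access to an asset is checked against a verified identity and the policy attached to that asset. The Python sketch below illustrates only this general pattern; it is not Xage’s product or API, and all names (AccessPolicy, Identity, is_authorized) are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class AccessPolicy:
    """Per-asset policy: which roles may access it. Illustrative only."""
    asset_id: str
    allowed_roles: set[str]      # e.g., {"grid-operator"}
    require_mfa: bool = True     # strong authentication before any access

@dataclass
class Identity:
    user: str
    roles: set[str]
    mfa_verified: bool

def is_authorized(identity: Identity, policy: AccessPolicy) -> bool:
    """Deny by default; grant only when role and MFA both check out."""
    if policy.require_mfa and not identity.mfa_verified:
        return False
    return bool(identity.roles & policy.allowed_roles)

# Example: an operator with verified MFA may access a hypothetical
# inverter controller; the same request without MFA is rejected.
policy = AccessPolicy(asset_id="der-inverter-01", allowed_roles={"grid-operator"})
assert is_authorized(Identity("alice", {"grid-operator"}, True), policy)
assert not is_authorized(Identity("bob", {"grid-operator"}, False), policy)
```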

Three major U.S. utilities, with more expected to join, are partners with CECA: Berkshire Hathaway Energy, Duke Energy, and Xcel Energy. At the end of each cohort cycle, the cyber innovators will present their solutions to the utilities, with the goal of making an immediate impact.

Additionally, CECA participants benefit from access to NREL’s unique testing and evaluation capabilities, including its ARIES cyber range, developed with support from EERE. The ARIES cyber range provides one of the most advanced simulation environments with unparalleled real-time situational awareness and visualization to evaluate renewable energy system defenses.

Applications for the second CECA cohort will open in early January 2023 for providers offering solutions that uncover hidden risks stemming from incomplete visibility into system and device security and configuration.

NREL is the U.S. Department of Energy’s primary national laboratory for renewable energy and energy efficiency research and development. NREL is operated for DOE by the Alliance for Sustainable Energy LLC.

New Ultrathin Capacitor Could Enable Energy-Efficient Microchips

Scientists turn century-old material into a thin film for next-gen memory and logic devices

[from Berkeley Lab, by Rachel Berkowitz]

Electron microscope images show the precise atom-by-atom structure of a barium titanate (BaTiO3) thin film sandwiched between layers of strontium ruthenate (SrRuO3) metal to make a tiny capacitor. (Credit: Lane Martin/Berkeley Lab)

The silicon-based computer chips that power our modern devices require vast amounts of energy to operate. Despite ever-improving computing efficiency, information technology is projected to consume around 25% of all primary energy produced by 2030. Researchers in the microelectronics and materials sciences communities are seeking ways to sustainably manage the global need for computing power.

The holy grail for reducing this digital demand is to develop microelectronics that operate at much lower voltages, and therefore require less energy—a primary goal of efforts to move beyond today’s state-of-the-art CMOS (complementary metal-oxide-semiconductor) devices.

Non-silicon materials with enticing properties for memory and logic devices exist, but in their common bulk form they still require large voltages to manipulate, making them incompatible with modern electronics. Designing thin-film alternatives that not only perform well at low operating voltages but can also be packed into microelectronic devices remains a challenge.

Now, a team of researchers at Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley has identified one energy-efficient route—by synthesizing a thin-layer version of a well-known material whose properties are exactly what’s needed for next-generation devices.

First discovered more than 80 years ago, barium titanate (BaTiO3) found use in various capacitors for electronic circuits, ultrasonic generators, transducers, and even sonar.

Crystals of the material respond quickly to a small electric field, flip-flopping the orientation of the charged atoms that make up the material in a manner that is reversible but persists even after the applied field is removed. This provides a way to switch between the proverbial “0” and “1” states in logic and memory storage devices—but it still requires voltages larger than 1,000 millivolts (mV).

Seeking to harness these properties for use in microchips, the Berkeley Lab-led team developed a pathway for creating films of BaTiO3 just 25 nanometers thick—less than a thousandth of a human hair’s width—whose orientation of charged atoms, or polarization, switches as quickly and efficiently as in the bulk version.

“We’ve known about BaTiO3 for the better part of a century and we’ve known how to make thin films of this material for over 40 years. But until now, nobody could make a film that could get close to the structure or performance that could be achieved in bulk,” said Lane Martin, a faculty scientist in the Materials Sciences Division (MSD) at Berkeley Lab and professor of materials science and engineering at UC Berkeley who led the work.

Historically, synthesis attempts have resulted in films that contain higher concentrations of “defects”—points where the structure differs from an idealized version of the material—as compared to bulk versions. Such a high concentration of defects negatively impacts the performance of thin films. Martin and colleagues developed an approach to growing the films that limits those defects. The findings were published in the journal Nature Materials.

To understand what it takes to produce the best, low-defect BaTiO3 thin films, the researchers turned to a process called pulsed-laser deposition. Firing a powerful beam of ultraviolet laser light onto a ceramic target of BaTiO3 transforms the material into a plasma, which then transfers atoms from the target onto a surface to grow the film. “It’s a versatile tool where we can tweak a lot of knobs in the film’s growth and see which are most important for controlling the properties,” said Martin.

Martin and his colleagues showed that their method could achieve precise control over the deposited film’s structure, chemistry, thickness, and interfaces with metal electrodes. By chopping each deposited sample in half and looking at its structure atom by atom using tools at the National Center for Electron Microscopy at Berkeley Lab’s Molecular Foundry, the researchers revealed a version that precisely mimicked an extremely thin slice of the bulk.

“It’s fun to think that we can take these classic materials that we thought we knew everything about, and flip them on their head with new approaches to making and characterizing them,” said Martin.

Finally, by placing a film of BaTiO3 between two metal layers, Martin and his team created tiny capacitors—the electronic components that rapidly store and release energy in a circuit. Applying voltages of 100 mV or less and measuring the current that emerged showed that the film’s polarization switched within two billionths of a second, and could potentially be faster—competitive with what it takes for today’s computers to access memory or perform calculations.
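For an order-of-magnitude sense of the energies involved, one can treat the device as a parallel-plate capacitor. The figures below are illustrative assumptions, not values reported by the team: an electrode area of 1 µm² and a relative permittivity of order 1,000 (typical of bulk BaTiO3; thin-film values vary widely). Then

\[
C = \frac{\varepsilon_0 \varepsilon_r A}{d} \approx \frac{(8.85\times10^{-12}\,\mathrm{F/m})\,(10^{3})\,(10^{-12}\,\mathrm{m^{2}})}{25\times10^{-9}\,\mathrm{m}} \approx 0.35\ \mathrm{pF},
\qquad
E = \tfrac{1}{2}CV^{2} \approx \tfrac{1}{2}\,(0.35\ \mathrm{pF})\,(0.1\ \mathrm{V})^{2} \approx 1.8\ \mathrm{fJ},
\]

a femtojoule-scale energy per switching event on these assumptions—the kind of budget that makes low-voltage ferroelectric capacitors attractive for memory and logic.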

The work serves the bigger goal of creating materials with small switching voltages and of examining how interfaces with the metal components necessary for devices affect such materials. “This is a good early victory in our pursuit of low-power electronics that go beyond what is possible with silicon-based electronics today,” said Martin.

“Unlike our new devices, the capacitors used in chips today don’t hold their data unless you keep applying a voltage,” said Martin. Current technologies generally work at 500 to 600 mV, while a thin-film version could work at 50 to 100 mV or less. Together, these measurements demonstrate a successful optimization of voltage and polarization robustness—qualities that tend to trade off against each other, especially in thin materials.
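Because the energy needed to charge a capacitor scales with the square of the voltage, the quoted operating ranges imply a large per-operation saving. Taking rough midpoints of the two ranges (about 550 mV today versus 75 mV for the thin film) and assuming comparable capacitance, a back-of-envelope comparison gives

\[
\frac{E_{\mathrm{today}}}{E_{\mathrm{film}}} = \left(\frac{V_{\mathrm{today}}}{V_{\mathrm{film}}}\right)^{2} \approx \left(\frac{550\ \mathrm{mV}}{75\ \mathrm{mV}}\right)^{2} \approx 54,
\]

that is, a reduction of well over an order of magnitude in energy per switching event on this simple estimate.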

Next, the team plans to make the material even thinner, so that it is compatible with real devices in computers, and to study how it behaves at those tiny dimensions. At the same time, they will work with collaborators at companies such as Intel Corp. to test its feasibility in first-generation electronic devices. “If you could make each logic operation in a computer a million times more efficient, think how much energy you save. That’s why we’re doing this,” said Martin.

This research was supported by the U.S. Department of Energy (DOE) Office of Science. The Molecular Foundry is a DOE Office of Science user facility at Berkeley Lab.

Tangled Up Multifactorial Causes

Workers’ Shrinking Share of the Pie

The current issue of the Federal Reserve Bank of Richmond periodical, Econ Focus (second/third quarter 2019), has a good article, “Workers’ Shrinking Share of the Pie” [Archived PDF], which offers the following conclusion:

So what explains the recent decline in labor’s share? Unfortunately, it is difficult to untangle the separate roles of automation, globalization, and changes in market power. Automation has likely played a role, but its independent impact is hard to gauge, due to the difficulty in differentiating the recent wave of automation from previous episodes in which labor’s share of national income held steady. Globalization appears to have been a strong contributor—a claim that is buttressed by the near simultaneity of the rise in U.S. trade with China and the decline of labor’s share. A variety of evidence also points to firms’ increased pricing power in product markets and workers’ weakened bargaining power in labor markets. In product markets, information technology and globalization appear to have increased the pricing power and profitability of certain dominant firms. And in labor markets, the insecurity engendered by automation and globalization may have helped to weaken workers’ bargaining power. In short, from the perspective of workers, multiple forces have come together to narrow their slice of an expanding economy.

(second/third quarter 2019 Econ Focus, Richmond Fed, page 17)

It so happens that Professor Robert Lawrence of Harvard (Kennedy School), in his very careful analyses, shows that recent American economic “numbers” are consistent with internal American numbers from decades ago. This implies that the overall transformation of the American economy came primarily from within (“endogenous”) and not, until very recently, from without (“exogenous”). Pre-existing internal American trends might be the dominant cause over many decades.

The variable or axis “endogenous versus exogenous” (internal pushes or external pulls?) adds a whole dimension to the discussion, and we therefore have to avoid mono-causal explanations.

We live not only in a multifactorial world but in one where “inside pushes” can be confused with “external pulls.”