Looking Back to Look Forward

Winston Churchill said, “The farther back you can look, the farther forward you are likely to see.”

The brilliant baseball player and coach Satchel Paige seemed to disagree with Churchill when he said, “Don’t look back. Something might be gaining on you.”

Marc Bloch, in The Historian’s Craft (French: Apologie pour l’histoire), wrote that history is obviously a backward-looking discipline, but warned against an obsession with origins.

Edward Bellamy’s utopian time-travel novel, Looking Backward: 2000–1887, is another example of this thought. His protagonist falls asleep in 1887 and wakes to the United States of the year 2000. The book critiques the 19th-century U.S. through the lens of the future.

Alain Badiou looks back over the whole span from the Neolithic period to today, describing our moment as a “time of crisis.”

…everybody thinks there is a crisis. Is philosophy capable of seizing hold of this crisis, while maintaining its fundamental aims? That is obviously my position. I certainly recognize that humanity is in crisis, which I take to be the final spasm of the whole Neolithic period, the period of classes, of private property, of the power of the state, of technology, and so on. This started in Egypt and China six or seven thousand years ago and now this ends up in what is after all a very difficult situation to control. It is the outcome of everything that this gigantic period has swept along with it. This includes the status of truths, which today are perhaps a bit domesticated by an uncontrollable situation of predation and destruction.

After all, technology is tributary to science; everything is supposed to be mediated by information, even aesthetics; love has become calculable because you can calculate scientifically the person who best matches with you. All this indeed is at the origin of a gigantic crisis in philosophy. My own position is that we can be in a position of active resistance to what is happening, while holding onto the original categories of philosophy. A form of resistance that nevertheless consists in dramatically changing into something else. We should not hope to reform the world such as it is: I think this is completely impossible. Of course, one can try to do the best one can, but little by little everyone recognizes that the world we live in is catastrophic. And that is certainly true. It is catastrophic because it is the end—and here we should think big—of several millennia. It is not just the end of the nineteenth and twentieth centuries; it is the end of the world of social classes, of inequalities, of state power, of the subservience to science and technology, of private property colonizing everything, of senseless and criminal wars.

Alain Badiou, Badiou by Badiou, translated by Bruno Bosteels, Stanford University Press, 2022, pages 26-27.

Badiou argues that the world has always been threatened by catastrophe, and that philosophy is a reaction to that threat.

Let us recall that Socrates and Plato were people who already intervened at the end of the Greek city. They too found themselves in a world threatened by catastrophe: they did not live in a stable and established world at all. That ends with Alexander the Great, who brings order to all this in the form of an imperial creation, and finally with the Romans and their monster of a state the likes of which had never been seen before. The Greek city and Greek democracy thus ended in the imperialism of ancient Rome. Thus, we may also find inspiration in Plato in this last regard. Plato is the first complete philosopher, but he already lives in a time of crisis. Of course, Athens was very famous and celebrated, but at the same time it was already corrupted and fragile. During Plato’s own lifetime, not to mention Aristotle, Macedonian imperialism is already present. Aristotle was Alexander the Great’s first tutor; he was a prototype of the corrupted and, moreover, the inventor of academic philosophy!

Similarly, if we take the greatest philosophers—Plato, Descartes, Hegel—we again find the same type of figure. Hegel is obviously the philosopher caught up in the French Revolution and its fundamental transformations; Descartes, for his part, is caught up in the emergence of modern science. All these philosophers are caught up in considerable shakeups of their time, in the fact that an old society is on the verge of dying and the question of what is going to appear that is new. We too find ourselves in the same situation: we must continue along these lines, by taking inspiration from what those philosophers did. Thus, they considered that the moment had come to work on a renewed systematicity of philosophy, because the conditions had changed. So, based on the conditions as they existed, it was time to propose an innovative way out of the existing constraints, an individual and collective liberation. From this point of view, we can find inspiration in the great classical philosophical tradition: we need not reject it, nor claim that all this is finished and find solace in an insurmountable nihilism, nor adopt the Heideggerian critique of metaphysics going back all the way to Plato. All this is pointless, and finally becomes incorporated into the disorder of the world. On the contrary, we must hold onto the fact that philosophy has always been particularly useful, possible, and necessary in situations of grave crisis for the collective, and from there pursue the work of our great predecessors.

Alain Badiou, Badiou by Badiou, translated by Bruno Bosteels, Stanford University Press, 2022, pages 29-30.

Contrast “What was the Neolithic world that led to the unleashing of technology?” (Badiou, Badiou by Badiou, page 25) and “Yesterday don’t matter if it’s gone.” (The Rolling Stones, “Ruby Tuesday”). Perhaps we can conclude that wisdom is knowing when the past is useful in understanding the future.

Digitizing Heritage: Exploring the Transformation of Culture to Data

[from India in Transition by the Center for the Advanced Study of India at the University of Pennsylvania, 1 September 2025]

by Krupa Rajangam & Deborah Sutton

“Oh that. We just took some undergraduate history students on board as interns. They provided the content and it was done.”

The co-founder of a digital heritage initiative promoting interactive user interfaces offered these opening remarks. Speaking at a Delhi-based museum, he had been asked about the information provided to users as they moved their hands across an interactive board, revealing images and narratives relating to the Indian freedom movement. His response made clear that the physical and digital components of such installations—for example, the 3D-modeling software and hardware, the scanning equipment and its resolution, and the user interface—were more carefully designed and calibrated than the content they provided.

Contemporary cultural heritage (CH) is rife with digital innovation. The COVID pandemic accelerated this transformation as archivists and curators worked to develop content that would reach remote, locked-down audiences. Within significant limits, digital platforms can democratize and facilitate access to materials previously inaccessible. Instead of being physically siloed, digitized material—as data components and not just content on culture—can be reproduced, combined, and circulated infinitely to achieve a reach previously considered impossible. Accessibility and malleability remain among the great boons of digital formats. But here, we consider the information economy of CH practice as it exists—and not its extraordinary and often hypothetical potential—in two overlapping realms of digitized CH: for-profit business enterprises and academic side-hustles related to more mainstream academic research.

In the former, questions of what is shared are often less significant than the appeal of the format. In the latter, innovation is often the result of short-term projects that languish, abandoned after project completion, and rarely find audiences. Our research builds on our individual experiences and on the findings of a scoping exercise, conducted in 2021-22, that examined a number of India-based heritage projects. It suggests the need for more careful consideration of the implications of transforming CH materials into forms of data; the change affects everything from how we understand “originality” to the reliance on for-profit services to deliver heritage material to the public.

As digitized representations of CH and access to such formats become more widespread, are we, as CH practitioners and academics, giving enough thought to how digital technologies are reshaping the nature of CH and its audience? Beyond questions of wider reach, are we sufficiently acknowledging how these changes challenge a continued focus on originality and the notion of the academy as the primary controller of access to knowledge and its validity, both in research and practice?

Digitizing for Dissemination

In 2019, one of us—Deborah Sutton—developed a software platform, Safarnama, including an app and authored experiences around Delhi’s CH. The project subsequently extended to Karachi. Generating “original” content, such as audio-visual clips and old photos, to be hosted on the app platform, was key to its attractiveness and usefulness, but permissions proved tricky. Some collaborators who were initially keen to contribute content quietly withdrew, likely due to the unfamiliar format and unknown reach. The app format also raised other questions. Would incorporating content from non-digital but published scholarship require authorial permission or only acknowledgement?

In 2020, Krupa Rajangam held a sponsored incubation at the NSRCEL, a business incubator located at the Indian Institute of Management-Bangalore, to develop a web interface that would host geo-located stories of marginalized histories by drawing on both historical facts and lived experiences. Corporate mentors remained skeptical of her ability to source “original” content on an ongoing basis, i.e., content that was both authenticated and validated. They repeatedly advised her to focus on the format, user experience, and appeal for “mass markets” so her prototype would find audiences. Both projects raised the same questions about who would consume the content and what constitutes the public or audience.

In a scoping exercise undertaken for the Arts and Humanities Research Council (AHRC), UK, in 2021-22, we explored a number of India-based heritage projects funded since 2015 by the AHRC in partnership with the Newton Fund and the Indian Council for Historical Research. We were particularly interested in the digital components, which all projects included, even if only a website.

Our exploratory surveys firmly established the divergence in interpreting both CH and digital technologies, which was not surprising. Some projects defined and treated CH as fixed pre-existing material, to be interpreted and presented to audiences through digital technologies. Others re-framed digital formats of CH as components of data, assembling, manipulating, and representing extant archival and other materials. The rest generated digitized CH, effectively altering its nature. Typically, such projects dealt with more ephemeral or less conventional forms of CH.

Fundamental Transformations

Notions of originality remain central to art, architectural and art historical training, and CH practice. Digitization transforms the value of “original” material in physical archives, such as the old maps and letters much lauded in traditional “analog” scholarship, from access and retrieval value into use value as data. Once the end-user (audience) accesses this data (whether historical facts or stories), it becomes nothing more than bytes occupying valuable space, to be deleted once consumed rather than stored, making it easy to overlook or disregard the source and its context.

For example, in the Safarnama project, the app contained carefully collected and authenticated narratives on “partition memories” in Delhi and Karachi. However, the bite-sized media format meant that users would only explore content once, as snippets. This realization led the team to extend the software to let users download content, which at least meant they could collect, organize, and store (archive) the assembled media.

Digitization also takes away the materiality of the archive, making it more ephemeral. The non-digital materials through, and into, which we render CH can (in endless combinations and cycles) be lost, forgotten, sold, recovered, collected, displayed, and stored. Digital files obviously share these capacities, but their existence and sustained end-user access depend on varied and dynamic software ecologies. Digital files created within one software architecture can be incompatible with, and therefore rendered obsolete by, another. The ethos of software development is constant change.

In another paper, we examined questions of quantity, quality, and reusability of data related to the digitization of building-crafts knowledge alongside the CARE (Collective Benefit, Authority to Control, Responsibility, Ethics) and FAIR (Findable, Accessible, Interoperable, Reusable) principles of data management. Both sets of principles were proposed and adopted by an international consortium of scholars and industry; CARE focuses on the responsible collection, use, and dissemination of data, especially data relating to vulnerable people, while FAIR focuses on sustainable data management.

As an example, one AHRC project experimented with methods to capture detailed 3D images of heritage sites and structures in dynamic, crowded environments. The team used one set of methods to capture the interiors and another for the exteriors, hoping to merge the two into holistic imagery for audiences. This proved impossible at first due to issues of software compatibility. Once those were partially resolved, the new software couldn’t handle the sheer volume of data captured—and it was unclear where and for how long such volumes of data would be stored.

New realms of intellectual property remain fuzzy. While the content on digital platforms is governed by licensing and proprietary legal frameworks, it is often hosted on open platforms, through web repositories such as GitHub. Prima facie, such openness appears to challenge the proprietorial nature of archives and other repositories as keepers of knowledge. However, it raises a host of questions about how to maintain a critical understanding of archives.

Digitization may, and should, transform access, but should it obliterate the regimes through which the materials were generated and organized, and the record of what was included or excluded? For example, a local coordinator of one project that engaged with artists commented that digital technologies are typically used to document technical skills as forms of intangible heritage and develop artist encyclopedias, saying that “they are hardly used to interrogate the reality that many ‘traditional’ artists hail from marginalized castes.” Similarly, the local coordinator of another project that engaged with communities living in and around a protected heritage site commented on how digital technologies often end up being used to create a record of heritage structures without any reference to their day-to-day setting.

Any and all digital enterprise in CH, we argue, needs to integrate the ambition to use digital methods to not just present but also counter and interrogate the material, its creation, and its purpose. Digital platforms and web- and app-based software are now able to manipulate and re-situate information in unprecedented ways. The novelty of such formats can displace original, provocative, and timely considerations of the material. Often, we are so taken by the visual and structural attributes of these formats that we accept them at face value and lose sight of the tone and content of heritage as a curated message about the past and the present.

Alongside this, digital augmentations and iterations of CH, including storage, have significant financial and infrastructural implications. The creation and maintenance of digital platforms requires either developing “in-house” digital specialization or, more commonly, reliance on private, for-profit platforms. Paying for external provision introduces complexities. Funders, including the AHRC, struggle to devise guidance or policy in relation to software licensing. A persistent challenge to projects, and to partnerships between academic and non-academic partners, is devising data and software strategies that subsist beyond the life of the funded research project. Often, the adverse effects of the paucity of longer-term planning around IP issues, sustainability, and data archiving fall disproportionately on the non-academic stakeholder.

While digitization foregrounds the potential and promise of complete openness and equity, that promise may be lost in practice. Or digitization may merely mark the displacement of one set of ethics by another. There is a need for more careful consideration of the implications, complexities, and risks of taking CH materials out of boxes and off shelves and transforming them into data files, which are, in turn, dependent on digital platforms to provide end-user access. The question remains whether heritage-related disciplines are adequately prepared and willing to confront such new ways of working, which have begun to dislodge some of the privileges extant in current forms of research and practice.

Krupa Rajangam is nearing the end of her tenure as a Fulbright Fellow at the Historic Preservation Department, Weitzman School of Design, University of Pennsylvania. Her permanent designation is Founder-Director, Saythu…linking people and heritage, a professional conservation collective based in Bangalore, India.

Deborah Sutton is a Professor in Modern South Asian History at Lancaster University.

Monomania and the West

There have been all kinds of “voices” in the history of Western civilization. Perhaps the loudest voice is that of the monomaniacs, who always claim that behind the appearance of the many is the one. If we picture the West as rooted in the intersection of Athens and Jerusalem, we see the origins of this monomania. Plato’s realm of ideas was supposed to explain everything encountered in our daily lives. His main student and rival, Aristotle, had his own competing explanation, based in biology instead of mathematics.

These monomanias find their modern counterpart in ideologies. In communism, the key to everything is class and the resulting class struggle. Nazism revolves around race and racial conflict.

In our own era, the era of scientism, the idea of God is replaced by Stephen Hawking’s “mind of God,” Leon Lederman’s The God Particle, and KAKU Michio’s The God Equation. In the 2009 film Angels & Demons, a senior Vatican official, played by Ewan McGregor, is absolutely outraged by the blasphemous phrase “the God particle.”

Currently, the monomaniacal impetus continues at full force. For example, Professor Seth Lloyd of MIT tells us that reality is cosmos and not chaos, because all of reality together is a computer. His MIT colleague Max Tegmark argues in his book Our Mathematical Universe that the world is not merely described by mathematics but rather is mathematics. Perhaps the climax of this kind of thinking is given to us in the essay “Everything Is Computation” by Joscha Bach:

These days we see a tremendous number of significant scientific news stories, and it’s hard to say which has the highest significance. Climate models indicate that we are past crucial tipping points and irrevocably headed for a new, difficult age for our civilization. Mark van Raamsdonk expands on the work of Brian Swingle and Juan Maldacena and demonstrates how we can abolish the idea of spacetime in favor of a discrete tensor network, thus opening the way for a unified theory of physics. Bruce Conklin, George Church, and others have given us CRISPR/Cas9, a technology that holds promise for simple and ubiquitous gene editing. “Deep learning” starts to tell us how hierarchies of interconnected feature detectors can autonomously form a model of the world, learn to solve problems, and recognize speech, images, and video.

It is perhaps equally important to notice where we lack progress: Sociology fails to teach us how societies work; philosophy seems to have become infertile; the economic sciences seem ill-equipped to inform our economic and fiscal policies; psychology does not encompass the logic of our psyche; and neuroscience tells us where things happen in the brain but largely not what they are.

In my view, the 20th century’s most important addition to understanding the world is not positivist science, computer technology, spaceflight, or the foundational theories of physics.

It is the notion of computation. Computation, at its core, and as informally described as possible, is simple: Every observation yields a set of discernible differences.

These we call information. If the observation corresponds to a system that can change its state, we can describe those state changes. If we identify regularity in those state changes, we are looking at a computational system. If the regularity is completely described, we call this system an algorithm. Once a system can perform conditional state transitions and revisit earlier states, it becomes almost impossible to stop it from performing arbitrary computation. In the infinite case—that is, if we allow it to make an unbounded number of state transitions and use unbounded storage for the states—it becomes a Turing machine, or a Lambda calculus, or a Post machine, or one of the many other mutually equivalent formalisms that capture universal computation.

Computational terms rephrase the idea of “causality,” something that philosophers have struggled with for centuries. Causality is the transition from one state in a computational system to the next. They also replace the concept of “mechanism” in mechanistic, or naturalistic, philosophy. Computationalism is the new mechanism, and unlike its predecessor, it is not fraught with misleading intuitions of moving parts.

Computation is different from mathematics. Mathematics turns out to be the domain of formal languages and is mostly undecidable, which is just another word for saying “uncomputable” (since decision making and proving are alternative words for computation, too). All our explorations into mathematics are computational ones, though. To compute means to actually do all the work, to move from one state to the next.

Computation changes our idea of knowledge: Instead of justified true belief, knowledge describes a local minimum in capturing regularities between observables. Knowledge is almost never static but progresses on a gradient through a state space of possible worldviews. We will no longer aspire to teach our children the truth, because, like us, they will never stop changing their minds. We will teach them how to productively change their minds, how to explore the never-ending land of insight.

A growing number of physicists understands that the universe is not mathematical but computational, and physics is in the business of finding an algorithm that can reproduce our observations. The switch from uncomputable mathematical notions (such as continuous space) makes progress possible. Climate science, molecular genetics, and AI are computational sciences. Sociology, psychology, and neuroscience are not: They still seem confused by the apparent dichotomy between mechanism (rigid moving parts) and the objects of their study. They are looking for social, behavioral, chemical, neural regularities, where they should be looking for computational ones.

Everything is computation.

Know This: Today’s Most Interesting and Important Scientific Ideas, Discoveries, and Developments, John Brockman (editor), Harper Perennial, 2017, pages 228-230.
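Bach’s informal definition is concrete enough to sketch in code. What follows is a minimal Python illustration of the definition itself, not anything from Bach’s essay; the Collatz rule stands in for an arbitrary “completely described regularity” of state transitions.

# A minimal sketch of Bach's informal definition of computation.
# The Collatz rule here is just an arbitrary example of a
# "completely described regularity"; any state-to-state function would do.

def step(state: int) -> int:
    # One completely described state transition (the regularity).
    return state // 2 if state % 2 == 0 else 3 * state + 1

def observe(state: int, max_steps: int = 100) -> list[int]:
    # Observation yields discernible differences: the recorded
    # sequence of states as the system transitions.
    trace = [state]
    for _ in range(max_steps):
        state = step(state)
        trace.append(state)
        if state == 1:  # this example happens to settle at 1
            break
    return trace

print(observe(6))  # [6, 3, 10, 5, 16, 8, 4, 2, 1]

Recording the trace is the “observation” that yields information; the rule, completely described, is the algorithm; allow unbounded steps and storage and, as Bach notes, such systems reach universal computation.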

Friedrich Nietzsche rebelled most profoundly against this type of thinking. If scientism represents the modern, then Nietzsche was the prophet of postmodernism. Nietzsche’s famous phrase “God is dead” is not about a creator or divinity, but rather about finality itself. There is no final explanation.

How to Be an Info-Observer and Knowledge Self-Educator: Parachutist Skills

MetaIntelligence is the mental jump where you go from being processed by the system to being the processor of the system.

The U.S. Federal Deposit Insurance Corporation has a “flagship” publication called FDIC Quarterly.

In Volume 15, Number 2 (2021) of this periodical [PDF], there’s an article called:

The Historic Relationship between Bank Net Interest Margins and Short-Term Interest Rates

(pages 31 to 41)

The authors of this piece have a boxed insert on the first page where they define NIM (Net Interest Margin), the phrase and acronym you see in the title above.

The insert begins like this:

Net Interest Margin is a key profitability ratio…

This measure is so popular that banks report it, bank examiners assess it for individual banks, and the FDIC calculates it for the industry every quarter in the “Quarterly Banking Profile.” For a vast majority of banks, net interest income is the primary source of income, and for such banks NIM is a primary component of profitability.

(FDIC Quarterly, 2021, Volume 15, Number 2, page 31)
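To make the ratio concrete: the standard definition of NIM is net interest income (interest earned minus interest paid) divided by average earning assets, annualized. Here is a toy Python calculation with invented figures, not numbers from the FDIC article:

# Toy NIM (Net Interest Margin) calculation; all figures are invented.
# Standard definition: annualized net interest income divided by
# average earning assets.

interest_income = 50_000_000        # hypothetical quarterly interest earned
interest_expense = 18_000_000       # hypothetical quarterly interest paid
avg_earning_assets = 4_000_000_000  # hypothetical average earning assets

net_interest_income = interest_income - interest_expense
nim = (net_interest_income * 4) / avg_earning_assets  # annualize one quarter

print(f"NIM = {nim:.2%}")  # NIM = 3.20%

The quarterly figure is multiplied by four because ratios of this kind are conventionally reported at an annualized rate.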

Such FDIC publications are freely available online and elsewhere. Suppose you borrow an issue from the library or download a copy from the FDIC’s archive and read it attentively. You could begin the process of “parachuting” into something outside your ken, namely banking and finance.

Learning to become a “parachutist” in knowledge and information is the only way to escape the kind of “house arrest” forced on you by whatever you happened to specialize in at school. If you accept this kind of “knowledge detention,” you will always be “stranded on your lonely island,” which is not what you want and is potentially a form of “stupidization.” This agility acknowledges the obvious fact that “you can’t major in everything.”

To parachute in and back out of knowledge domains is a profound component of the remedial educational skill we call MetaIntelligence.