by Karlye Dilts Stedman, Amaze Lusompa & Phillip An
Disentangling how the economy responds to a monetary policy decision from its response to macroeconomic conditions at the time of the decision is an ongoing challenge. One popular method researchers use to measure the effect of a monetary policy announcement—high-frequency identification—analyzes the reaction of fast-moving financial variables immediately following the policy announcement, using a time window long enough for markets to respond but not so long that the response is contaminated by other information.
Since high-frequency identification was introduced in the early 2000s, policymakers have introduced tools such as forward guidance and large-scale asset purchases. Karlye Dilts Stedman, Amaze Lusompa, and Phillip An examine how the evolution of monetary policy has changed high-frequency identification and assess whether additional changes might be necessary to better capture the effect of modern monetary policy surprises. Although researchers have continually updated the asset mix used in high-frequency identification over time, they have not updated the measurement window. Because the timing of monetary policy communication has changed significantly in recent years, refining the length of this measurement window may be necessary going forward.
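For readers who want the mechanics, here is a minimal sketch of a high-frequency surprise measure, assuming hypothetical intraday quotes on a futures-implied policy rate and a conventional window of 10 minutes before to 20 minutes after the announcement; the data layout and names are illustrative, not the authors' code.

```python
import pandas as pd

def hf_surprise(quotes: pd.DataFrame, announce: pd.Timestamp,
                pre_min: int = 10, post_min: int = 20) -> float:
    """High-frequency monetary policy surprise: the change in a
    market-implied policy rate over a short window around the
    announcement.

    quotes: DataFrame with a DatetimeIndex and an 'implied_rate'
    column (e.g., derived from fed funds futures prices).
    """
    start = announce - pd.Timedelta(minutes=pre_min)
    end = announce + pd.Timedelta(minutes=post_min)
    window = quotes.loc[start:end, "implied_rate"]
    # Surprise = post-announcement level minus pre-announcement level.
    return window.iloc[-1] - window.iloc[0]
```

Adjusting `pre_min` and `post_min` is exactly the refinement of the measurement window the authors raise: too short a window can miss the market's digestion of lengthy statements and press conferences, while too long a window picks up unrelated news.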
When objects interact with light in particular ways — by absorbing or reflecting it — we see in color. A sunset’s orange hues and the ocean’s deep blues inspire artists and dazzle observant admirers. But colors are more than pretty decor; they also play a critical role in life. They attract mates, pollinators and seed-spreaders, and signal danger. And the same color can mean different things to different organisms: A red bird might attract a mate, while a red berry might warn off a hungry human.
For color to communicate meaning, systems to produce it had to evolve, by developing pigments to absorb certain wavelengths of light or structures to reflect them. Organisms also had to produce the machinery to perceive color. When you look out into a forest, you might see lush greenery dappled with yellowish sunlight and pink blooms. But this forest scene would look different if you were a bird or a fly. Color-perception machinery — which includes photoreceptors in our eyes that recognize and distinguish light — can differ between species. While humans can’t see ultraviolet light, some birds can. While dogs can’t see red or green, many humans can. Even within species there’s some variation: People who are colorblind have trouble distinguishing some combinations, such as green and red. And many organisms can’t see color at all.
Within one planet, many colorful worlds exist. But how did colors evolve in the first place?
What’s New and Noteworthy
To pinpoint when different kinds of color signals may have evolved, researchers recently reviewed many papers, covering hundreds of millions of years of evolutionary history, to bring together information from the fossil record and phylogenetic trees (diagrams that depict evolutionary relationships between species). Their analysis across the tree of life suggested that color signals likely evolved much later than color vision. Color vision likely evolved twice, developing independently in arthropods and fish between 400 million and 500 million years ago. Plants then started using bright colors to attract pollinators and seed-dispersing animals; later, animals started using colors to warn off predators and, eventually, to attract mates.
One of the most common colors that we see in nature is green. However, this isn’t a color signal: It’s a result of photosynthesis. Most plants absorb almost all the photons in the red and blue light spectra but only 90% of the green photons. The remaining 10% are reflected, making the plants appear green to our eyes. But why did they evolve to do this? According to one model, this makes photosynthetic machinery more stable, suggesting that sometimes evolution favors stability over efficiency.
The majority of colors in nature are produced by pigments that absorb or reflect different wavelengths of light. While many plants can produce these pigments on their own, most animals can’t; instead, they acquire pigments from their diet. Some pigments, though, are hard to acquire, so some animals instead rely on nanoscale structures that scatter light in particular ways to create “structural colors.” For example, the shell of the blue-rayed limpet has layers of transparent crystals, each of which diffracts and reflects a sliver of the light spectrum. When the layers grow to a precise thickness, around 100 nanometers, the wavelengths in each layer interact with one another, canceling each other out — except for blue. The result is the appearance of a bright blue limpet shell.
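The optics behind such structural color can be sketched with the standard first-order condition for a periodic multilayer reflector: at normal incidence, the peak reflected wavelength is roughly twice the optical thickness of one layer pair. The refractive indices and thicknesses below are illustrative stand-ins, not measured values for the limpet shell.

```python
def bragg_peak_nm(n1: float, d1: float, n2: float, d2: float) -> float:
    """First-order reflectance peak (in nm) of a two-material periodic
    multilayer at normal incidence: lambda = 2 * (n1*d1 + n2*d2)."""
    return 2.0 * (n1 * d1 + n2 * d2)

# Illustrative values only: ~100 nm high-index mineral layers (n ~ 1.6)
# alternating with thinner, lower-index organic layers (n ~ 1.4).
print(bragg_peak_nm(n1=1.6, d1=100.0, n2=1.4, d2=55.0))  # ~474 nm: blue
```

Wavelengths near this peak reflect in phase from successive layer boundaries and reinforce one another, while other wavelengths interfere destructively and pass through, which is why only the blue survives.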
“Oh that. We just took some undergraduate history students on board as interns. They provided the content and it was done.”
The co-founder of a digital heritage initiative promoting interactive user interfaces offered these opening remarks. Speaking at a Delhi-based museum, he had been asked about the information provided to users as they moved their hands across an interactive board, revealing images and narratives relating to the Indian freedom movement. His response clarified that the physical and digital components of such installations—for example, the 3D-modeling software and hardware, scanning equipment and its resolution, and the user interface—were more carefully designed and calibrated than the content they provided.
Contemporary cultural heritage (CH) is rife with digital innovation. The COVID pandemic accelerated this transformation as archivists and curators worked to develop content that would reach remote, locked-down audiences. Within significant limits, digital platforms can democratize and facilitate access to previously inaccessible materials. Instead of being physically siloed, digitized material—as data components and not just content about culture—can be reproduced, combined, and circulated infinitely to achieve a reach previously considered impossible. Accessibility and malleability remain among the great boons of digital formats. But here, we consider the information economy of CH practice as it exists—and not its extraordinary and often hypothetical potential—in two overlapping realms of digitized CH: for-profit business enterprises and academic side-hustles related to more mainstream academic research.
In the former, questions of what is shared are often less significant than the appeal of the format. In the latter, innovation is often the result of short-term projects that are abandoned after completion and rarely find audiences. Our research builds on our individual experiences and the findings of a scoping exercise, conducted in 2021-22, examining a number of India-based heritage projects. It suggests the need for more careful consideration of the implications of transforming CH materials into forms of data; the change affects everything from how we understand “originality” to the reliance on for-profit services to deliver heritage material to the public.
As digitized representations of CH and access to such formats become more widespread, are we, as CH practitioners and academics, giving enough thought to how digital technologies are reshaping the nature of CH and its audience? Beyond questions of wider reach, are we sufficiently acknowledging how these changes challenge a continued focus on originality, and on notions of the academy as the primary controller of access to knowledge and its validity, in both research and practice?
Digitizing for Dissemination
In 2019, one of us—Deborah Sutton—developed a software platform, Safarnama, including an app and authored experiences around Delhi’s CH. The project subsequently extended to Karachi. Generating “original” content, such as audio-visual clips and old photos, to be hosted on the app platform was key to its attractiveness and usefulness, but permissions proved tricky. Some collaborators who were initially keen to contribute content quietly withdrew, likely due to the unfamiliar format and unknown reach. The app format also raised other questions. Would incorporating content from non-digital but published scholarship require authorial permission or only acknowledgement?
In 2020, Krupa Rajangam held a sponsored incubation at NSRCEL, a business incubator located at the Indian Institute of Management-Bangalore, to develop a web interface that would host geolocated stories of marginalized histories by drawing on both historical facts and lived experiences. Corporate mentors remained skeptical of her ability to source “original” content on an ongoing basis, i.e., content that was both authenticated and validated. They repeatedly advised her to focus on the format, user experience, and appeal for “mass markets” so her prototype would find audiences. Both projects raised similar questions over who would consume the content and what constitutes the public or audience.
Our exploratory surveys firmly established the divergence in interpreting both CH and digital technologies, which was not surprising. Some projects defined and treated CH as fixed pre-existing material, to be interpreted and presented to audiences through digital technologies. Others re-framed digital formats of CH as components of data, assembling, manipulating, and representing extant archival and other materials. The rest generated digitized CH, effectively altering its nature. Typically, such projects dealt with more ephemeral or less conventional forms of CH.
Fundamental Transformations
Notions of originality remain central to art, architectural and art historical training, and CH practice. Digitization transforms the access and retrieval value of “original” material in physical archives, such as the old maps and letters lauded in traditional “analog” scholarship, into use value as data. Once the end-user (audience) accesses this data (whether historical facts or stories), it becomes nothing more than bytes occupying valuable space, to be deleted once consumed rather than stored, making it easy to overlook or disregard the source and its context.
For example, in the Safarnama project, the app contained carefully collected and authenticated narratives on “partition memories” in Delhi and Karachi. However, the bite-sized media format meant that users would only explore content once, as snippets. This realization led the team to develop the software further, incorporating the ability to download content, which at least meant that users could collect, organize and store (archive) the assembled media.
Digitization also takes away the materiality of the archive, making it more ephemeral. The non-digital materials through, and into, which we render CH can (in endless combinations and cycles) be lost, forgotten, sold, recovered, collected, displayed, and stored. Digital files obviously share such capacities, but their existence and sustained end-user access depend on varied and dynamic software ecologies. Digital files created within one software architecture can be incompatible with, and therefore rendered obsolete by, another. The ethos of software development is constant change.
In another paper, we examined questions of quantity, quality, and reusability of data related to the digitization of building-crafts knowledge alongside the CARE and FAIR principles of data management. The principles were proposed and adopted by an international consortium of scholars and industry; CARE focuses on the responsible collection, use, and dissemination of data, especially data relating to vulnerable people, while FAIR focuses on sustainable data management.
As an example, one project funded by the UK’s Arts and Humanities Research Council (AHRC) experimented with methods to capture detailed 3D images of heritage sites and structures in dynamic, crowded environments. The team used one set of methods to capture the interiors and another for the exteriors, hoping to merge the two into holistic imagery for audiences. This proved impossible at first due to issues of software compatibility. Once that was partially resolved, the new software couldn’t handle the sheer volume of data captured—and it was unclear where, and for how long, such volumes of data would be stored.
New realms of intellectual property remain fuzzy. While the content on digital platforms is governed by licensing and proprietary legal frameworks, it is often hosted on open platforms, through web repositories such as GitHub. Prima facie, such openness appears to challenge the proprietorial nature of archives and other repositories as keepers of knowledge. However, it raises a host of questions about how to maintain a critical understanding of archives.
Digitization may, and should, transform access, but should it obliterate the regimes through which the materials were generated and organized, and what was included or excluded? For example, a local coordinator of one project that engaged with artists commented that digital technologies are typically used to document technical skills as forms of intangible heritage and to develop artist encyclopedias, saying that “they are hardly used to interrogate the reality that many ‘traditional’ artists hail from marginalized castes.” Similarly, the local coordinator of another project that engaged with communities living in and around a protected heritage site commented on how digital technologies often end up being used to create a record of heritage structures without any reference to their day-to-day setting.
Any and all digital enterprise in CH, we argue, needs to integrate the ambition to use digital methods not just to present but also to counter and interrogate the material, its creation, and its purpose. Digital platforms and web- and app-based software can now manipulate and re-situate information in unprecedented ways. The novelty of such formats can displace original, provocative, and timely considerations of the material. Often, we are so taken by the visual and structural attributes of these formats that we accept them at face value and lose sight of the tone and content of heritage as a curated message about the past and the present.
Alongside this, digital augmentations and iterations of CH, including storage, have significant financial and infrastructural implications. The creation and maintenance of digital platforms requires either developing “in-house” digital specialization or, more commonly, relying on private, for-profit platforms. Paying for external provision introduces complexities. Funders, including the AHRC, struggle to devise guidance or policy in relation to software licensing. A persistent challenge for projects, and for partnerships between academic and non-academic partners, is devising data and software strategies that subsist beyond the life of the funded research project. Often, the adverse effects of this paucity of longer-term planning around IP issues, sustainability, and data archiving fall disproportionately on the non-academic stakeholder.
While digitization foregrounds the potential and promise of complete openness and equity, much of this may be lost in practice. Or digitization may merely mark the displacement of one set of ethics by another. There is a need for more careful consideration of the implications, complexities, and risks of taking CH materials out of boxes and off shelves and transforming them into data files, which are, in turn, dependent on digital platforms to provide end-user access. The question remains, however, whether heritage-related disciplines are adequately prepared and willing to confront such new ways of working, which have begun to dislodge some of the privileges extant in current forms of research and practice.
Krupa Rajangam is nearing the end of her tenure as a Fulbright Fellow at the Historic Preservation Department, Weitzman School of Design, University of Pennsylvania. Her permanent designation is Founder-Director, Saythu…linking people and heritage, a professional conservation collective based in Bangalore, India.
Although previous studies suggest anthropogenic forcing may influence extreme precipitation probability, few have specifically investigated the human influence on moisture transport. Here, we leverage the record-breaking summer precipitation of 2023 in Northern China (NC) to address this gap. Combining station observations with Coupled Model Intercomparison Project Phase 6 (CMIP6) model outputs, we demonstrate that the 2023-like heavy precipitation event was exacerbated by anthropogenically enhanced moisture transport. External forcing increased the probability of extreme southeasterly moisture transport by a factor of approximately 1.3 (90% confidence interval: 1.0–1.8). Moreover, total anthropogenic forcing likely increased the probability of similar precipitation events by a factor of at least 1.7 (1.0–3.1), with both greenhouse gases and anthropogenic aerosols contributing positively. As greenhouse gas concentrations rise and anthropogenic warming intensifies, the frequency of similar extreme precipitation events in NC is projected to increase further.
The extent of human influence on moisture transport and consequent heavy precipitation remains a critical research question. While anthropogenic contributions to precipitation extremes are increasingly recognized, studies specifically addressing human-induced changes in moisture transport remain limited. The record-breaking summer precipitation in Northern China (NC) during 2023 provides a salient case study. This extreme event was fueled by substantial moisture transport from the southeast into NC, driven by Typhoons Doksuri and Khanun. Attribution analyses indicate that both greenhouse gas and anthropogenic aerosol emissions likely increased the probability of similar heavy precipitation events and associated moisture transport patterns. Such events are projected to become more frequent with continued anthropogenic warming. These findings demonstrate that human activities significantly influence moisture transport pathways and consequently modulate extreme precipitation occurrence in NC, deepening our understanding of the physical mechanisms underlying these events.
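The headline numbers are probability (or risk) ratios: the chance of exceeding an event threshold in the factual, all-forcings climate divided by the chance in a counterfactual, natural-only climate. Below is a generic sketch of that calculation with a percentile-bootstrap confidence interval, using placeholder arrays rather than the study's data.

```python
import numpy as np

def probability_ratio(factual, counterfactual, threshold,
                      n_boot=10_000, ci=0.90, seed=0):
    """PR = P(X > threshold | all forcings) / P(X > threshold | natural only),
    with a percentile bootstrap confidence interval."""
    rng = np.random.default_rng(seed)
    f, c = np.asarray(factual), np.asarray(counterfactual)
    pr = np.mean(f > threshold) / np.mean(c > threshold)
    boots = []
    for _ in range(n_boot):
        fb = rng.choice(f, size=f.size, replace=True)
        cb = rng.choice(c, size=c.size, replace=True)
        p0 = np.mean(cb > threshold)
        if p0 > 0:  # skip resamples with no counterfactual exceedances
            boots.append(np.mean(fb > threshold) / p0)
    lo, hi = np.quantile(boots, [(1 - ci) / 2, (1 + ci) / 2])
    return pr, (lo, hi)
```

A PR of 1.7 means the event has become 1.7 times as likely under human influence; a confidence interval whose lower bound touches 1.0, as in the abstract, signals that the attribution is likely but not unequivocal.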
The slower momentum is concerning because, as we show in a new staff discussion note, green innovation is not only good for containing climate change, but for stimulating economic growth too. As the world confronts one of the weakest five-year growth outlooks in more than three decades, those dual benefits are particularly appealing. They ease concerns about the costs of pursuing more ambitious climate plans. And when countries act jointly on climate, we can speed up low-carbon innovation and its transfer to emerging markets and developing economies.
IMF research shows that doubling green patent filings can boost gross domestic product by 1.7 percent after five years compared with a baseline scenario. And that’s under our most conservative estimate—other estimates show up to four times the effect.
A key question is how countries can better foster green innovation and its deployment. We highlight how domestic and global climate policies spur green innovation. For example, a big increase in the number of climate policies tends to boost green patent filings, our preferred proxy for green innovation, by 10 percent within five years.
One reason policy synchronization has a prominent impact on domestic green innovation is what is called the market size effect. There’s more incentive to develop low-carbon technologies if innovators can expect to sell into a much larger potential market, that is, into countries that have adopted similar climate policies.
Another is that climate policies in other countries generate green innovations and knowledge that can be used in the domestic economy. This is known as technology diffusion. Finally, synchronized policy action and international climate commitments create more certainty around domestic climate policies, as they boost people’s confidence in governments’ commitment to addressing climate change.
The risks of protectionism are exacerbated when climate policies, such as subsidies, do not abide by international rules. For example, local content requirements, whereby only locally produced green goods benefit from subsidies, undermine trust in multilateral trade rules and could result in retaliatory measures.
Scientists have gained new insight into how a resting battery could, in rare cases, undergo thermal runaway, overheat, and catch fire.
In order to better understand how a resting battery might undergo thermal runaway after fast charging, scientists are using a technique called “operando X-ray microtomography” to measure changes in the state of charge at the particle level inside a lithium-ion battery after it’s been charged.
Their work shows for the first time that it is possible to directly measure current inside a resting battery even when the external current measurement is zero.
Much more work is needed before the findings can be used to develop improved safety protocols.
How likely is it that an electric vehicle battery would self-combust and explode? The chances are actually pretty slim: Some analysts say that gasoline vehicles are nearly 30 times more likely to catch fire than electric vehicles. But recent news of EVs catching fire while parked has left many consumers – and researchers – scratching their heads over how these rare events could possibly happen.
“What’s exciting about this work is that Nitash Balsara’s group isn’t just looking at images – they’re using the images to determine how batteries work and change in a time-dependent way. This study is a culmination of many years of work,” said co-author Dilworth Y. Parkinson, staff scientist and deputy for photon science operations at Berkeley Lab’s Advanced Light Source (ALS).
The team is also the first to measure ionic currents at the particle level inside the battery electrode.
“What happens after fast charging when the battery is at rest is a little mysterious,” Balsara said. But the method used for the new study revealed important clues.
Experiments led by first author Alec S. Ho at the ALS show that when graphite is “fully lithiated” or fully charged, it expands a tiny bit, about a 10% change in volume – and that current in the battery at the particle level could be determined by tracking the local lithiation in the electrode. (Ho recently completed his Ph.D. in the Balsara group at UC Berkeley.)
The researchers also learned that the measured internal currents decreased substantially in about 20 minutes. Much more work is needed before their approach can be used to develop improved safety protocols.
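In outline, the particle-level current follows from the time derivative of the local state of charge: if a graphite particle of mass m stores capacity q_max when fully lithiated, its current is approximately i(t) = q_max · dx/dt, where x(t) is the lithiation fraction tracked in the tomograms. The sketch below uses nominal graphite numbers and a made-up relaxation curve; it illustrates the bookkeeping, not the study's actual pipeline.

```python
import numpy as np

Q_GRAPHITE_MAH_PER_G = 372.0  # theoretical specific capacity of graphite (LiC6)

def particle_current_mA(t_s: np.ndarray, x: np.ndarray,
                        mass_g: float) -> np.ndarray:
    """Particle-level current (mA) from a lithiation-fraction series x(t),
    via i = q_max * dx/dt (negative values indicate delithiation)."""
    q_max_mAh = Q_GRAPHITE_MAH_PER_G * mass_g
    t_h = t_s / 3600.0                  # convert seconds to hours
    return q_max_mAh * np.gradient(x, t_h)

# Illustrative only: lithiation relaxing from 1.0 toward 0.8 over ~20 minutes
# in a nanogram-scale graphite particle.
t = np.linspace(0.0, 1200.0, 121)       # seconds
x = 0.8 + 0.2 * np.exp(-t / 400.0)
print(particle_current_mA(t, x, mass_g=1e-9)[:3])
```

A decaying x(t) like this one is consistent with the reported observation that internal currents die away substantially within about 20 minutes of rest.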
Several movies offer an “enchanting” back door or window into chemistry, letting you “beat” the tediousness of regular education and come into the field and its topics through film:
I.
The Man in the White Suit is a 1951 British comedy classic with Alec Guinness as a genius research chemist. He fiddles with his flasks and polymer and textile chemistry experiments until he invents a fabric that shows no wear and tear “forever.” This would seem like a great boon to humanity in its clothing needs but the chemist (“Sidney Stratton”) finds that both labor and management reject his discovery violently as it threatens jobs and profits. Textile or fabric polymer chemistry is at the heart of the plot.
RDX was used by both sides in World War II. The U.S. produced about 15,000 long tons per month during WWII and Germany about 7,000 long tons per month. RDX had the major advantages of possessing greater explosive force than TNT, which had been used in World War I, and of requiring no additional raw materials for its manufacture.
Semtex was developed and manufactured in Czechoslovakia, originally under the name B 1 and then under the “Semtex” designation since 1964, labeled as SEMTEX 1A, since 1967 as SEMTEX H, and since 1987 as SEMTEX 10. Originally developed for Czechoslovak military use and export, Semtex eventually became popular with paramilitary groups and rebels or terrorists because prior to 2000 it was extremely difficult to detect, as in the case of Pan Am Flight 103.
A suspicious device resembling those used in the bombings was found and defused in an apartment block in the Russian city of Ryazan on 22 September. On 23 September, Vladimir Putin praised the vigilance of the inhabitants of Ryazan and ordered the air bombing of Grozny, which marked the beginning of the Second Chechen War. Three FSB agents who had planted the devices at Ryazan were arrested by the local police, with the devices containing a sugar-like substance resembling RDX.
We are told that on an evening in 1889, Mr. Holmes was seated in 221B Baker Street at the deal table loaded with retorts and test tubes. He was settling down to one of those all-night chemical researches in which he frequently indulged.
The research work was interrupted by a message of distress from Violet Hunter. Watson found that there was a train the next morning, and Holmes told Watson:
“That will do very nicely. Then perhaps I had better postpone my analysis of the acetones as we may need to be at our best in the morning.”
“In the fractional distillation of coal-tar, the distillate separates into five distinct groups or layers, depending upon the stage of the process and the amount of heat applied. Category one of the five includes benzene, toluene, xylenes and cumenes.”
The advance means that many more qubits, the basic calculating unit, can be joined together than is possible on a single microchip. This will make a more powerful quantum computer possible.
The scaling of qubit numbers from the current level of around 100 qubits to nearer 1 million is central to creating a quantum processor that can make useful calculations.
The significant achievement is based on a technical blueprint for creating a large-scale quantum computer, which was first published in 2017 with funding from EPSRC.
Their development may help solve pressing challenges from drug discovery to energy-efficient fertilizer production. But their impact is expected to sweep across the economy, transforming most sectors and all our lives.
Potential to scale up
Winfried Hensinger, Professor of Quantum Technologies at the University of Sussex and Chief Scientist and co-founder at Universal Quantum, said:
The researchers successfully transported qubits between chips using electric fields, with a 99.999993% success rate and a connection rate of 2424 transfers per second. Both numbers are world records.
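To put those records in perspective, the reported success rate implies a per-transfer error probability of about 7 × 10⁻⁸. Here is a quick back-of-the-envelope sketch (the chain lengths are arbitrary illustrations, not figures from the announcement):

```python
p_success = 0.99999993   # reported single-transfer success rate
rate_per_s = 2424        # reported transfers per second

for n in (1_000, 100_000, 1_000_000):
    p_all = p_success ** n          # chance every transfer in the chain succeeds
    print(f"{n:>9,} transfers: P(all succeed) = {p_all:.4f}, "
          f"time = {n / rate_per_s:,.0f} s")
```

Even a million consecutive transfers would succeed end-to-end about 93% of the time, which is why such a low error rate matters for scaling from roughly 100 qubits toward a million.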
Dr. Kedar Pandya, Director of Cross-Council Programmes at EPSRC, said:
This significant milestone is evidence of how EPSRC funded science is seeding the commercial future for quantum computing in the UK.
The potential for complex technologies, like quantum, to transform our lives and create economic value widely relies on visionary early-stage investment in academic research.
We deliver that crucial building block and are delighted that the University of Sussex and its spin-out company, Universal Quantum, are demonstrating the strength it supports.
The background of immigrants to China is becoming more diverse. While the number of high-earning expatriates from developed countries has peaked, China is now also attracting more students than ever from all over the world, including many from less-developed countries. Low-skilled labor migration and migration for marriage are also on the rise. The main areas that attract foreigners are the large urban centers along the coast (Guangzhou, Shanghai, Beijing) and borderland regions in the South, Northeast and Northwest, but smaller numbers are also making their way to smaller cities across China.
According to their analysis, for many foreigners China has become considerably less accommodating over the last ten years, particularly with regard to border control, public security, visa categories, and work and residence permits. China’s immigration policy is still driven by narrow concerns of regulation, institutionalization and control. It remains predicated on attracting high-quality professionals, researchers, entrepreneurs and investors. Long-term challenges, like the emerging demographic transition, remain to be addressed.
The authors detect a worrying trend towards intolerance to ethnic and racial difference, fed by increasing nationalism and ethnic chauvinism. They argue that the Chinese government, civil society, foreign diplomatic missions, employers of foreigners and international organizations present in China should take a clear stance against racism and discrimination. China’s immigration policy needs to include the integration of foreigners into society and provide clear and predictable paths to acquiring permanent residence.