Origins
I gave them for gone forever.
Yet when I came again to uncongeal the liquor all my little Animals made their re-appearance, and danced and frisked about as lively as ever.
Henry Power FRS, Experimental Philosophy, 1664
Elsewhere, however, the science and engineering of low temperatures revolutionised steel manufacturing, enabled rocket propulsion, and created hydrogen bombs. By the mid-20th century, the manufacture and transportation of low-temperature liquefied gases had been transformed by these undertakings, and the search for further ways in which low temperatures could be used had become intense.
And so, the scene was set for serendipity: the unsought finding that glycerol enabled the almost complete revival of biological samples after cryopreservation (Polge et al., 1949).
Our solutions
Our designs integrate artificial intelligence, applied mathematics and laboratory measurements to turn terabytes of data into powerful new cryopreservation solutions.
Our design cycles begin with a search for the best possible chemical compounds. Over the last 75 years, cryopreservation techniques have relied heavily upon just a few compounds; indeed, it is humbling to note that nearly all of these had been identified by the early 1950s. Their utility is well-established, but so too are their limitations – though attention is often fixed on the wrong shortcomings.
$4 trillion
Global research expenditure (1950-2024)
Since the 1950s, roughly $4 trillion has been spent on research and development worldwide. We are applying the new knowledge this has created to improve cryopreservation outcomes.
Our approach to unlocking this store of knowledge is comprehensive and, we believe, unique: similar in motivation to other efforts, but different in every essential detail.
Further details
Many databases compile academic literature; PubMed is perhaps the most famous, whilst pre-print archives are rapidly emerging as the largest. It is a challenge for people to keep up with new publications, and even more difficult and time-consuming to effectively navigate the wider corpus of roughly 7 million extant academic papers.
So we don’t.
Instead, we have developed proprietary artificial intelligence systems that collate and process experimental and chemical data.
We search further and deeper than others.
We specialise in navigating the deep web – a source of data roughly 400 times larger than the surface web.
These systems run continuously.
We feed data workloads for analysis from our quantum-secure servers to the custom-coded models we have built and run on a quasi-supercomputer: Hpc7g instances of Amazon Elastic Compute Cloud (EC2).
We evaluate evidence using techniques such as semi-automated bias-weighted Bayesian meta-analyses. Promising compounds are entered into our high-throughput screening systems, which we iterate across target tissues and cell-types.
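By way of illustration, the sketch below shows the simplest conceivable version of such an analysis: conjugate normal-normal Bayesian pooling in which each study's variance is inflated according to an assessed risk of bias. The numbers, and the weighting scheme itself, are illustrative stand-ins rather than our production pipeline.

```python
import numpy as np

# Illustrative per-study effect sizes (e.g. log odds of post-thaw survival),
# standard errors, and bias scores in [0, 1] (1 = lowest risk of bias).
effects = np.array([0.42, 0.10, 0.55, 0.31])
std_err = np.array([0.10, 0.15, 0.20, 0.12])
bias = np.array([0.9, 0.4, 0.6, 0.8])

# Down-weight biased studies by inflating their variance.
adj_var = std_err**2 / bias

# Conjugate normal-normal update with a weakly informative prior N(0, 1).
prior_mean, prior_var = 0.0, 1.0
post_precision = 1.0 / prior_var + np.sum(1.0 / adj_var)
post_mean = (prior_mean / prior_var + np.sum(effects / adj_var)) / post_precision
post_sd = np.sqrt(1.0 / post_precision)

print(f"pooled effect: {post_mean:.3f} ± {post_sd:.3f}")
```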
As candidate compounds emerge, we need to understand how they interact with each other. We do so by implementing optimization techniques developed in applied mathematics and numerical analysis. Specifically, we characterise non-linear interactions across multi-dimensional parameter space, pin-pointing relationships with key categorical outcomes and identifying reproducible global minima.
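A minimal sketch of the reproducibility check, using SciPy's differential evolution restarted from independent seeds so that a global minimum is trusted only if it recurs. The response surface here is invented for illustration; our measured surfaces are higher-dimensional.

```python
import numpy as np
from scipy.optimize import differential_evolution

def loss(x):
    """Illustrative non-linear response surface: two cryoprotectant
    concentrations with an interaction term (a stand-in for measured data)."""
    a, b = x
    return (a - 1.2)**2 + (b - 0.8)**2 + 0.9 * np.sin(3 * a * b)

bounds = [(0.0, 3.0), (0.0, 3.0)]  # illustrative concentration ranges (M)

# Re-run from independent seeds: a minimum is trusted only if it reappears.
minima = [differential_evolution(loss, bounds, seed=s, tol=1e-8).x
          for s in range(5)]
print(np.round(minima, 3))
```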
At this point, hands-on low-throughput experiments become essential.
The outcome used most widely to evaluate cryopreservation solutions and protocols is the proportion of cells that survive after thawing. Dyes are used to measure the integrity of cell membranes, with an intact membrane considered indicative of survival.
Different dyes yield very different results, a fact which many rely on to flatter their data or products.
alamarBlue is the most accurate. Trypan Blue can appear to improve survival more than three-fold for otherwise identical samples – from 28% to 95% – but the effect simply isn't real.
Even viability, itself a poorly defined concept, is not sufficient.
Meaningful survival requires long-term functionality: a conserved ability to multiply in number, to perform cells' original tasks or, in the hands of our collaborators, to successfully undergo health-giving manipulations.
Our methods provide unique support for cells at every stage to help them achieve these standards.
Low-temperature phenomena
It is essential that there is the closest possible connection between the design of solutions and the protocols developed for their application: a solution will only be effective if it is designed for specific samples, fully specified workflows, and accurately determined temperature profiles.
There are two main approaches to the cryopreservation of biological samples: slow freezing and vitrification. We have solutions and protocols designed for both.
Slow freezing and vitrification represent fundamentally different ways of navigating the physical and biological phenomena associated with low temperatures. They each present distinct benefits and challenges.
Crucially, the misapplication of either approach will completely destroy biological samples.
This happens every day in research laboratories across the world, and represents a major impediment to the progress of contemporary research. With greater care and improved methods, it is avoidable.
Slow freezing
Slow freezing is the most widely used approach to cryopreservation. It is based on Mazur's two-factor hypothesis (1972), itself a synthesis of Luyet's ice crystal mechanics (1949), Lovelock's salt biochemistry (1953) and Meryman's osmotic hypothesis (1968, 1971).
The aim is to find a rate of cooling fast enough to limit cells' exposure to concentrated solutes, but not so fast that ice forms within the cells themselves – and to do all this whilst ensuring cells avoid intolerable osmotic changes. The ideal would be to follow the liquidus curve, which demarcates the temperature below which ice forms, but its determination is challenging in both theory and practice – and the optimal cooling rate varies across cell types over a roughly 500-fold range.
As a result, there is usually some amount of ice formed under these protocols.
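For orientation, the dilute-limit liquidus can be sketched from freezing-point depression. Real cryopreservation solutions deviate strongly from this ideal-solution estimate at working concentrations, which is precisely why determining the true curve is so challenging.

```python
# Ideal-solution estimate of a liquidus curve via freezing-point depression:
# T_m ≈ 0 °C − K_f · m, with K_f = 1.86 K·kg/mol for water. This is only the
# dilute-limit sketch; real solutions deviate strongly at high concentration.
K_F = 1.86  # cryoscopic constant of water, K·kg/mol

def liquidus_temp_c(molality):
    """Approximate temperature (°C) below which ice forms, dilute limit."""
    return -K_F * molality

for m in (0.5, 1.0, 2.0, 4.0):
    print(f"{m:4.1f} mol/kg -> {liquidus_temp_c(m):6.2f} °C")
```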
Vitrification
Vitrification provides an alternative approach to cryopreservation and aims to avoid the formation of ice entirely. It too involves taking samples to temperatures so low that molecules retain the unstructured arrangement of a liquid yet lack the thermal energy for translational rearrangement. As such, there is no change of thermodynamic phase, just tremendously high viscosity. The key difference from slow freezing is that samples must reach these temperatures without prolonged intermediate cooling.
Traditional vitrification protocols aim to achieve this by increasing cryoprotectant concentrations without changing temperature, to raise the glass transition temperature, then cooling at the greatest possible rate until they pass this threshold. The aim is to pass so rapidly through the unstable supercooled phase that there is insufficient time for ice to form.
Crucially, it has proven challenging to predict the glass transition threshold from either theoretical predictions or empirical measurements. This reduces the science to "hit-and-hope".
We have developed an alternative approach. By combining mathematical models with new forms of spectroscopy we have built a unique ability to accurately calculate glass transition temperatures.
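Our own calculation rests on proprietary models and spectroscopy. For context, the classical empirical starting point is the Gordon-Taylor mixing rule for a binary water-cryoprotectant mixture, sketched here with illustrative constants (the mixing constant k and the cryoprotectant's Tg are system-specific fitted values).

```python
def gordon_taylor_tg(w_cpa, tg_water=136.0, tg_cpa=150.0, k=1.0):
    """Gordon-Taylor estimate of a binary mixture's glass transition (K).

    w_cpa    : mass fraction of cryoprotectant
    tg_water : Tg of pure water, ~136 K
    tg_cpa   : Tg of the pure cryoprotectant (illustrative value)
    k        : fitted mixing constant, system-specific (illustrative)
    """
    w_water = 1.0 - w_cpa
    return (w_water * tg_water + k * w_cpa * tg_cpa) / (w_water + k * w_cpa)

for w in (0.2, 0.4, 0.6):
    print(f"w = {w:.1f} -> Tg ≈ {gordon_taylor_tg(w):.1f} K")
```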
Warming
Whichever approach you take to reaching low temperatures, the use and study of most biological samples requires their return to temperatures above 0 °C. This means once again crossing the range of temperatures at which ice forms.
For samples that were vitrified this may mean ice forming for the first time. For samples that were frozen slowly, there can be similar novel ice growth, but they also face a further challenge: recrystallization.
During recrystallization, small, innocuous micro-crystals of ice rapidly aggregate into crystals which are much larger and more damaging. The consequences can be devastating: a seemingly well-preserved sample without visible ice can be completely destroyed during warming.
In broad terms, the greatest possible rate of warming is desirable – but this is difficult to achieve, especially for larger samples.
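One way to see why warming rate matters is classical Lifshitz-Slyozov-Wagner (LSW) coarsening, in which the cube of the mean crystal radius grows linearly in time, so every extra second spent in the recrystallization zone translates directly into crystal growth. The rate constant below is invented for illustration.

```python
# Toy LSW coarsening: r(t)^3 = r0^3 + K*t.
# K is invented for illustration; real values are system-specific.
K = 5e-3   # um^3/s, illustrative ripening rate constant
r0 = 0.1   # um, initial micro-crystal radius

def radius_um(t_seconds):
    return (r0**3 + K * t_seconds) ** (1 / 3)

# Dwell time in the recrystallization zone for fast vs slow warming.
for label, dwell in (("fast warming, 2 s", 2), ("slow warming, 120 s", 120)):
    print(f"{label}: mean crystal radius ≈ {radius_um(dwell):.2f} um")
```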
Ice recrystallization inhibitors can reduce the rate of temperature change required for successful warming. Most of these compounds act by binding to the face of ice crystals, though molecular dynamics simulations reveal important specific differences.
Many compounds have been shown to reduce ice recrystallization, and there has been considerable interest in the synthesis of compounds which have improved performance or pathways to commercialisation.
Notably, many naturally-occurring antifreeze proteins, such as those illustrated below, not only affect recrystallization but also produce thermal hysteresis and shape ice crystals more profoundly. Our approach has been to prioritise compounds with this broader range of beneficial effects.
Crystal structure of a hyperactive, Ca2+-dependent, beta-helical antifreeze protein from an Antarctic bacterium: https://www.wwpdb.org/pdb?id=pdb_00003p4g
Crystal structure of a hyperactive Type I antifreeze protein from a winter flounder: https://www.wwpdb.org/pdb?id=pdb_00004ke2
Further details
A wide range of topics and techniques informs our design of protocols. Relevant recent studies of biological structures, macromolecules, cryoprotectants, lipids and water have used spectroscopy (nuclear magnetic resonance, infrared, Raman, sum frequency generation (SFG)), neutron diffraction, and neutron scattering. The crucial molecular dynamics of cryoprotectants interacting with water, lipids or solvated proteins may be studied quantum mechanically or atomistically, on the basis of mesoscopic particles, or using continuum theory.
Our pragmatic approach to this proliferation of methods has been to prioritize mathematical models based on accurate empirical measurements of key biophysical characteristics of each cell or tissue type. Many of these are affected in a dose-dependent manner by cryoprotectants.
Such measurements are time-consuming, but essential: our values for parameters such as permeability, diffusivity, and viscosity – the foundations upon which models are built – diverge substantially from values in the published literature. We have identified, and avoided, the technical mistakes behind those published values. The revised numbers have far-reaching implications: protocols developed using inaccurate parameters simply cannot succeed.
We apply these measurements to develop system-specific models based on mass-transport, network thermodynamics and diffusion. Using high-performance computing we build models in which diffusion may be concentration-dependent, non-Fickian, or span heterogeneous media and moving boundaries. By using a range of methods we are able to identify the conclusions which remain secure in the face of variation.
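As a flavour of the simplest member of this family, the sketch below integrates one-dimensional diffusion with a concentration-dependent coefficient by explicit finite differences. The geometry, the form of D(c), and the boundary conditions are illustrative, not fitted parameters.

```python
import numpy as np

# Explicit finite-difference solver for 1-D diffusion with a
# concentration-dependent coefficient: dc/dt = d/dx( D(c) dc/dx ).

def d_of_c(c):
    return 1e-9 * (1.0 + 0.5 * c)   # m^2/s, toy concentration dependence

nx, L = 101, 1e-3                    # grid points, domain length (m)
dx = L / (nx - 1)
c = np.zeros(nx)
c[0] = 1.0                           # fixed cryoprotectant bath at one face

dt = 0.2 * dx**2 / d_of_c(1.0)       # respect the explicit stability limit
for _ in range(20000):
    d_mid = d_of_c(0.5 * (c[1:] + c[:-1]))   # D evaluated at cell interfaces
    flux = -d_mid * np.diff(c) / dx          # Fickian flux between cells
    c[1:-1] -= dt * np.diff(flux) / dx       # conservative interior update
    c[0], c[-1] = 1.0, c[-2]                 # Dirichlet / zero-flux ends

print(c[::10].round(3))                      # concentration profile snapshot
```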
As part of this we undertake experiments to arbitrate between models based on the alignment between their predictions and our measurements. Finally, we use Gaussian processes and random Fourier features to optimise our protocols and understand their parameter space.
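A compact sketch of the latter idea: random Fourier features approximate an RBF-kernel Gaussian process by Bayesian linear regression on randomised cosine features, which keeps optimisation over large parameter sweeps cheap. The data and hyper-parameters below are toy values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy protocol-response data: one tunable parameter, noisy measurements.
X = rng.uniform(0, 5, size=(30, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(30)

# Random Fourier features approximating an RBF kernel with lengthscale l.
l, n_feat = 0.7, 200
W = rng.standard_normal((X.shape[1], n_feat)) / l
b = rng.uniform(0, 2 * np.pi, n_feat)
phi = lambda x: np.sqrt(2.0 / n_feat) * np.cos(x @ W + b)

# Bayesian linear regression on the features ≈ GP regression.
noise, Phi = 0.1, phi(X)
A = Phi.T @ Phi + noise**2 * np.eye(n_feat)
w_mean = np.linalg.solve(A, Phi.T @ y)

X_test = np.linspace(0, 5, 6)[:, None]
print(phi(X_test) @ w_mean)          # approximate GP posterior mean
```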
To the best of our knowledge, this is a level of analysis unmatched in the history of cryopreservation.
Making the possible, practical.
Many cryopreservation protocols are too costly or complex to be used widely or reliably. They are reported in the academic literature – exciting descriptions of happenings in research laboratories – but they never become part of the real world.
Our protocols make the possible, practical.
And easy.
Because easy is the point at which implementation takes off and creates a new normal.
Usability powered the Information Age, and can do the same for biotechnology.
COPYRIGHT © COOPER-WHYTE LIMITED 2024. ALL RIGHTS RESERVED.