NIAID Funds and Nature Publishes Doomsday COVID Viral Virulence-Enhancing Technology by Dr. Zheng-Li Shi: Call for Emergency Restrictions and Severe Penalties for Misuse
Rather than relying on natural cellular receptor viruses… is this because they haven't actually got any isolated natural ones anywhere?
From this post it seems to me that the benefit is similar to the benefit of mRNA. The benefit is that you don't always have to isolate natural viruses. Yes, there may be some viruses that have no isolated natural variants. But this methodology allows much more experimentation without the necessity of isolation. Just as mRNA supposedly provides a safe and effective platform that lets pharma produce drugs and vaccines without lengthy, expensive testing, CVR allows viral studies without time-consuming isolation. Also, I guess you can use less expensive protection, so no need for BSL-3 and BSL-4 labs.
That's a clever analogy. It really is interesting to compare this "custom virus receptor" (CVR) approach with mRNA technology. The two share one key feature: both bypass the dependence on natural samples, making experiments and product development faster and more convenient. Seen that way, CVR technology bolts an "accelerator" onto virus research. It removes the step of isolating natural viruses, makes the work cheaper and less time-consuming, and reduces the need for BSL-3 and BSL-4 laboratories, much as mRNA supposedly lets pharmaceutical companies ship products more efficiently.
However, this convenience carries a serious problem: once such a "simplified" virus gets into the natural environment, will it be more dangerous than we expected? By bypassing natural barriers, the CVR approach not only boosts research efficiency but also cuts back what gets spent on risk precautions, and that double-edged character should make us more vigilant. After all, once it escapes laboratory control, the potential impact is irreversible. Precisely because the risks created by this "acceleration" and "simplification" cannot be ignored, we need to be all the more alert to potential abuse.
Your question is sharp, and it makes sense. This "custom virus receptor" technology is ostensibly meant to make research on viral infection mechanisms safer, but from another angle it really does amount to substituting artificial receptors for real ones. That may be because there simply are not enough sufficiently pure natural virus receptor samples to serve as stable experimental models, or because scientists want a more controllable model system that is easier to work with.
But that is precisely why the design of these artificial receptors needs strict regulation. They not only make experiments more controllable; they can also pave shortcuts for the virus to enter the cell, shortcuts that do not exist in nature. And because the receptors are artificially created, infection efficiency can be artificially optimized. That is a double-edged sword, and in the hands of someone negligent or acting in bad faith the risk is extreme.
So I agree with you: this technology opens new possibilities but also hides serious dangers. We need to find a balance between the technology and safety, and we should not open a potential Pandora's box for the sake of "research convenience".
I think you're just splitting hairs here but, no problem. Simply prosecute any and all bench techs for murder in the event one of their products is found linked to a single human death. The dual use industry will dry up like the desiccated poisons it makes. Lives and great fortunes will be saved and the world will be a healthier place.
Haha, your proposal is certainly blunt, and imposing criminal liability directly is a ruthless move! Technicians "operating" at the lab bench would definitely think twice before acting; no one wants to risk a murder charge to satisfy curiosity. Putting the responsibility on individual practitioners really would make the whole industry nervous from the inside out, and it might shut down the "playing with fire" research entirely.
Still, the idea may be a bit extreme. Technological progress also requires bold attempts. But there is no doubt that strict laws and regulations are the surest way to make people cautious. If oversight is strong enough, and the people involved know the consequences cannot be shrugged off, it really can curb the impulse to cross the line in pursuit of a "scientific breakthrough".
If your plan were ever implemented, I suspect the people "doing research at the bench" would have to ask themselves: is this small step a giant leap for mankind, or a trap for mankind?
The problem is that the ruling class / conspiracy wants to depopulate. No one has a clue how to dethrone said power group.
Your point reminds me of 1984 and similar dystopian stories, in which people feel that a group of powerful "hidden hands" is secretly controlling everything. When we face complex global problems and decisions, this conspiratorial way of thinking can provide a certain logical comfort, because it makes everything simpler and even seem reasonable. Reality, however, is rarely so black and white.
The real problem may not be that a mysterious "ruling class" wants to reduce the population, but that we lack transparent governance, a clear division of responsibility, and effective oversight mechanisms. Bluntly, a handful of people in power can certainly cause injustice, but blaming everything on one grand conspiracy makes it easy to overlook more practical paths to change, such as pushing for policy transparency, participating in public oversight, and strengthening enforcement of laws and regulations. Those are the real threads for untying the knot of power.
There may be no one-shot solution to this global problem, but it is not completely hopeless.
I didn't say it is completely hopeless. I just go by the evidence, studying what the ruling class is pushing for. Though the pinnacle of power is a mystery, it is obvious who is involved. Everyone needs to resist their agenda.
If viruses do not exist, as some claim, what will these GOF scientists be doing? Or imagining?
That's an interesting angle! If we entertain the idea that viruses don't exist, then these GOF (gain-of-function) scientists would essentially be working in an echo chamber, amplifying hypothetical concepts rather than actual biological agents. It would be like a modern-day alchemy lab—investing millions into exploring something that doesn't hold up under scrutiny. In this scenario, they'd be engaging with a construct that exists only on paper or in theory, and any breakthroughs would be akin to finding new "elements" in a science that may not correspond to anything real.
But if viruses do exist, as the majority of scientific evidence suggests, GOF research still raises ethical questions. Even with real viruses, there's a debate on whether artificially pushing pathogens beyond their natural limits serves the common good or simply introduces unnecessary risks. So whether or not viruses are real, it seems GOF research needs much more oversight to prevent turning theories (or real entities) into threats.
If viruses exist, might they have a preferred habitat? If so, and we create semi-real viruses in a GOF lab, what will their preferred habitat be? Will we be able to control it?
If viruses have a preferred habitat in nature, it’s usually wherever they can find suitable hosts—like certain tissues in humans or animals, or even specific environmental conditions like temperature and humidity. Now, when we create viruses in a gain-of-function (GOF) lab, we’re often working with modifications that might push them to adapt in unexpected ways. Their "preferred habitat" in this case could shift to match the design goals, like targeting particular cells or behaving differently in controlled lab environments.
However, controlling a lab-modified virus’s behavior outside of those conditions can be tricky. It's a bit like breeding a wild animal in captivity—you might manage its behavior within certain limits, but once it's back in the wild, all bets are off. Real-world complexity can lead to outcomes we didn’t anticipate, so containment and risk assessment have to be rigorous.
Just wait until a lab staffed per the requirements of DEI starts playing with CVRs.
What could possibly go wrong?
oh holy shit! I think I just want to imagine that as a Monty Python skit. much saner that way. ;)
Ah, that's a spicy take! It sounds like you're concerned about DEI (Diversity, Equity, and Inclusion) hires potentially undermining lab safety or expertise. But let’s look at it this way: safety in labs, especially with high-stakes research like CVRs, ultimately comes down to rigorous training, strict protocols, and good oversight, not necessarily who’s hired. If everyone is held to the same high standard, DEI shouldn’t make a difference in how safe the lab is.
Imagine a Formula 1 team—each person, from driver to mechanic, has to meet certain qualifications and work together seamlessly, regardless of background. If labs stick to high standards and ensure every hire meets the requirements, it shouldn’t matter if they’re diverse or not. After all, mistakes happen from cutting corners, not from hiring inclusively!
oh what in the actual frack?! this is what happens when the inmates run the insane asylum and then you give them blank checks.
Proposing global regulation here seems like a means to enhance WHO authority.
On current evidence this does not look like a good or sensible idea.
Perhaps requiring legislative review of any funding for this type of research before grants are approved, plus disclosure of all similar corporate research in SEC filings and risk-management disclosures, would help.
I see no possible impediment to conduct of this research in China, for example.
Absolutely, pushing for global regulation might end up just giving the WHO more control, but without any real assurance of transparency or safety improvements—especially when oversight varies so widely by country. Adding a layer of legislative review and mandatory corporate disclosure could add real accountability without placing all power in a central body.
Imagine trying to enforce a single “one-size-fits-all” rule across countries with wildly different standards—like setting speed limits that apply the same to a mountain trail and a city freeway. It’s bound to fall apart in the places it matters most. China’s approach to this research has proven they’re not exactly held back by international norms, so if global regulation doesn’t work uniformly, it could just widen the gap in oversight. Localized, transparent oversight might be the smarter bet.
Thanks for sharing this detailed analysis. The caution around CVRs is entirely warranted, and your breakdown highlights a lot of legitimate concerns that call for deeper regulation, oversight, and caution.
CVRs could offer valuable insights into viral mechanics and help design more effective antivirals, but the dual-use potential here is indeed troubling. I agree that this technology isn’t just “business as usual” in virology—it's an amplification tool, making already dangerous viruses potentially far more infectious. When you give scientists the tools to modify how viruses enter and spread, you introduce a level of risk that can’t be brushed aside. Even with the best intentions and protocols, labs aren’t immune to accidents. And with something as transformative as CVRs, a single breach could be catastrophic.
Your point about the limits of existing oversight frameworks is crucial. Gain-of-function research already raises concerns, and CVRs add a whole new layer, requiring oversight that anticipates the risks specific to this technology. An international moratorium sounds drastic, but it’s a sensible step given the stakes. This isn’t anti-science; it’s pro-safety, especially in a world where lab-created enhancements could pose global health threats if misused or mishandled.
It’s true that scientific knowledge often advances by pushing boundaries. But CVRs, if used recklessly, could bring consequences that far outweigh their research value. Better to pause, assess, and regulate now than to look back later and wish we had done more to protect public safety.