Dangers

This is a condensed list of the worst things atomically precise manufacturing and technology could be misused for.
The reader might not want to take it all in at once and might want to take a break with more optimistic outlooks.

Note that one might argue that the worst nightmares and the brightest utopias are both extreme cases that are extremely unlikely to come true in their full extent. The reality most likely lies somewhere in-between. The question is: How far can we push civilization away from what is generally considered bad and toward what is generally considered good? The opportunities that atomically precise technology will bring should hopefully significantly outweigh the dangers presented here.

A word of warning: In many cases panic, alarmism and ensuing overreaction (e.g. total bans) with too limited understanding of the full breadth of the situation can have worse effects than the danger would have brought if no action had been taken at all. This by itself could be added to the dangers (a "meta danger").

Every technology powerful enough to bring significant advances always comes with dangers too.
The plethora and level of scariness of the dangers listed here is just one place where the (neutral and impartial) power of atomically precise manufacturing technology shows.

List of possible dangers that could arise with the rise of advanced AP technology

WASTE – perhaps the biggest (and most underrated) risk of them all

  • non-degrading waste and its recycling (environment)

Waste related:

  • Spill (Related: Spill avoidance guideline)
  • sharp edges and splinters (health)

Software problems become even more physical

  • building on top of flawed software design (wikipedia: Technical debt)
  • loss of unmaintained data (older building plans) due to dependence on a giant library tree that changes and is not completely backed up (today most visible in the problem of archiving historic computer games) - "neo ephemeralism"
  • vulnerability to malicious software (See: self limitation for safety, ...)
  • undermining of the basis for both gratis and free open source hardware to exist due to fear-based over-regulation

Making even more and much worse of a mess when trying to fix our old messes

  • risky geoengineering going wrong
  • negative side effects of attempted cleanup after nuclear accidents (finely dispersed nanorobotic devices in the environment violating the "keep in machine phase" guideline, see: Mobility prevention guideline)

Bad old chemistry risks times a million

Dangerous substances:

  • easy production of explosives (e.g. solid sp3 nitrogen, solid quartz-like carbon dioxide)
  • easy production of simple poisons (e.g. HCN)
  • easy production of drugs (may be hard for piezochemical mechanosynthesis though, since its basic form can't deal well with complex floppy molecules - see: synthesis of food)

Risks with politics and dangerous philosophies

  • negative aspects of even more excessive surveillance than what is already starting to transpire with current-day technology (2021)
  • development of new weaponry (e.g. interfacial drive based kinetic energy weapons)
  • an arms race?

Extreme right wing horror

Unethical perfectionism or whatever the actors perceive to be "perfect".
"Strength" as the single prime goal.
Usually associated with the extreme political right (oversimplifying things).

Note that gemstone metamaterial technology is definitely not about building some crazy military stuff
(robotic super-soldiers or horrific killer drone-swarms or whatever).
It's just a neutral but immensely powerful technology.
A bit like "fire" (but much more difficult to attain) it can be used for both good and bad.

There are plenty of positive ways to use this technology. (See: Opportunities)
Also there is

  • a focus on making and keeping stuff recyclable and where needed there is even
  • a focus on making products as deliberately weak and degradable structures

Examples:

  • There are (bio)degradable (semi)gemstones. Periclase, for example, is somewhat water soluble, and other gemstones are even more so.
  • It will be desirable to design in intended-breakage functionality for strings, ropes, and other structures to prevent accidents

When something potentially extremely good turns into something pretty bad just because it happens too fast and is mismanaged

  • social and economic dangers like detrimental effects of rapid and imbalanced economic change
  • social and economic dangers like globally declining birth rates at a dangerous rate (material wealth tends to decrease birth rates)

Grey goo – perhaps a massively overrated risk

  • uncontrolled replication (toned down to more realistic levels). Related: reproduction hexagon and Mobility prevention guideline

Waste

Although advanced APM has the potential to be an absolutely clean technology in production (since its primary waste products are only hot air and warm water), advanced APM would be an unprecedentedly wasteful technology in disposal if not built correctly, since the products themselves must be considered waste once they become obsolete - and the products will become obsolete fast, as we can see with today's pace of software improvement. Also the global production rate in mass or volume will be bigger, since production will become so widespread and accessible.

  • Recycling is not an option, it's an obligation that has to be thought of before creating advanced APM systems that can produce products that do not biodegrade in reasonable timespans.

Related to the waste problem

  • recomposable microcomponents
  • The two recycling classes of products: The ones that can be completely burned to gases (C,H,O,N,S) &
    the ones that produce slags when burned (also containing: Si,Al,Ti,Fe,Na, and the whole remaining periodic table). See: Diamondoid waste incineration. (A small classification sketch follows this list.)
  • amorphous slag of various elements is hard to deal with since blind atomically precise disassembly of unknown structures is a very hard problem.
  • there's no AP disassembly (at least it's a lot harder to do than AP assembly)
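
The two-class split above can be summed up in a small illustrative sketch (Python). This is only a sketch under the assumption that a product's constituent elements are known; the element set and the function name are made up for illustration here and are not defined anywhere on this wiki.

  # Illustrative sketch only (assumed names and element set, not from this wiki):
  # classify a product into the two recycling classes named above, based on
  # whether all of its constituent elements burn completely to gases.

  GASIFIABLE = {"C", "H", "O", "N", "S"}  # elements that burn completely to gases

  def recycling_class(elements):
      """Return the recycling class for a product made of the given elements."""
      slag_formers = set(elements) - GASIFIABLE
      if not slag_formers:
          return "class 1: can be burned completely to gases"
      return "class 2: leaves slag (e.g. from " + ", ".join(sorted(slag_formers)) + ")"

  print(recycling_class({"C", "H", "O"}))          # class 1
  print(recycling_class({"C", "H", "Si", "Al"}))   # class 2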

Possible classification

old:ABC vs new:CBRN

Please do not use the old (and outdated) classification ABC.
ABC stood for: Atomic-hazard, Biohazard, Chemical-hazard.
Why? Because:

  • The benefit of easy remembrance does not outweigh
  • the didactic damage it is capable of doing and may have already done.

Today's (2021 and before) experts want to see

  • the old acronym ABC being superseded by
  • the new acronym CBRN.

CBRN stands for: Chemical, Biological, Radiological, Nuclear.
CBRN is much less mnemonically helpful than ABC, so opposing the use of the old acronym may be a bit of a fight against windmills.
Still, the acronym replacement is pushed because it stops a large group of non-technical people from
confusing atoms with nuclei and consequently from making ill-informed judgements, e.g. when voting.

About the current (2021-04) English Wikipedia pages about "weapons of mass destruction":

  • the English page does not even mention the old acronym and why it is so bad. Bad.
  • the German page has a bold note on the old acronym right in the second paragraph of the intro. Better.

This may also be especially relevant for the German-speaking world because
people there still seem to preferably call nuclear power plants by the misleading name "atomic power plants".

The unfortunately still common confusion between atoms and nuclei is also a problem for the
most recently adopted name for the technology that this wiki is about,
that is "atomically precise manufacturing (and technology)".

A better term might have been "chemical bond precise manufacturing (and technology)",
but unfortunately, when the term "atomically precise manufacturing (and technology)"
was introduced in the book "Radical Abundance", the atom-nucleus confusion trapdoor was not averted by the author.

C Chemical

Poisons are already quite simple to make in high quantities with current day technology.
Gem-gum factories would make that even easier if no smart regulations at all are included.
See main page: Poisons

B Biological

Today (2021):
Mainly referring to selective breeding and/or genetic engineering of infectious pathogenic agents.
Usually not referring to unhealthy effects from food originating from genetic engineering (and/or selective breeding).

As for the technology discussed in this wiki:

What may come with biohazard risks (of hard to predict magnitude) is a technology development path that
is not at all focused on targeting the stiff nanosystems of gemstone metamaterial technology:
a pathway where people want to recreate the soft nanomachinery of life in a similar fashion to how it already exists.
That is very much not the topic of this wiki.
We are talking about soft and floppy artificial membranes, vesicles and such here.
This technology is called synthetic biology.
Synthetic biology is still an interesting and potentially valuable research area. No attempt at discrediting it here. But:

  • Synthetic biology is not obviously directly relevant for getting to gem-gum tech ASAP.
  • Synthetic biology can eventually bring bio-hazard risks (of hard to predict magnitude), while such risk for near term pathway technology that is focusedly targeting gem-gum tech is for all practical purposes zero. It's increasingly stiff and dry construction brick blocks that are increasingly incompatible with biology, after all.

Taking the path back to make gem-gum technology compatible with biology again will take a lot of focused effort. See gem-gum nanomedicine. That is a topic that lies even beyond the (in comparison simple) far term target of gem-gum factories.

R Radiological and N Nuclear

See main article: APM and nuclear technology
There's also the risk of abuse of X-rays from ultracompact optical particle accelerators.

Why not add a "nanotechnological risk" to the list

Because it's too general.
This would be analogous to saying "macrotechnological risk".
Instead a prefix could be added like: nC, nB, nR, nN.
But that's still too ambiguous. One could add the technology level like: n1C, n2C, n3C,
but we won't go into such an overclassification rabbit hole on this wiki unless it becomes really necessary.
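
Purely as a hypothetical illustration of the prefix idea above (not an established or endorsed classification scheme; the function name and behaviour are assumptions), such labels could be composed like this:

  # Hypothetical sketch of the label composition described above (C, nC, n3C, ...).
  # Not an established scheme; names and behaviour are assumptions for illustration.

  HAZARD_CLASSES = {"C", "B", "R", "N"}  # chemical, biological, radiological, nuclear

  def hazard_label(hazard_class, nano=False, tech_level=None):
      """Compose a label like 'C', 'nC' or 'n3C'."""
      if hazard_class not in HAZARD_CLASSES:
          raise ValueError("unknown hazard class: " + hazard_class)
      prefix = "n" if nano else ""
      level = str(tech_level) if (nano and tech_level is not None) else ""
      return prefix + level + hazard_class

  print(hazard_label("C"))                           # C
  print(hazard_label("C", nano=True))                # nC
  print(hazard_label("C", nano=True, tech_level=3))  # n3C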

Controversial topics that can't be talked about without being caught somewhere in the middle

  • Borderline SciFi concepts like: Transhumanism & Mind uploading
    – this links to: ethics, religions, core beliefs, all highly emotionally laden
  • Highly controversial details about governance on all scales
  • Who are "the bad guys" really? Especially in commercial and political interest contexts this is often far from clear cut.
  • In more clear cut contexts: How much bad behavior can be excused on the basis of unfortunate personal histories? Or rather, how much can we afford to excuse?
  • Less severe: Stance on fission nuclear technology - repeated accidents have broken trust - strong pro & strong counter factions (done on Earth or sent from Earth)
  • ...

Far beyond only the usual suspects

There are topics that one should not, or might not want to, talk about openly,
as discussing these topics openly likely endangers one's own real-name reputation and possibly even one's physical safety.
– This absolutely does not mean that one's opinions on the topic must be immoral / unethical / despicable.
– This includes topics far beyond the usual suspects. Sometimes in intellectual areas few are even aware of.

Don't stare into hell for too long. It sucks you in!

Thinking too much about all the potentially bad things that could be committed or could happen by accident or nature
can be a dangerous psychological downward spiral.
Hell has no bottom and it wants to suck you in when you peer too hard down the abyss of true horror. So better don't.
If in any way possible, take breaks occupying yourself with more positive topics.

The dangerous-tech hide-or-share dilemma

Thinking openly and publicly about what malicious agents could attempt,
with the good intent to prevent and prepare for potential harmful actions of dangerous actors, may (if done carelessly)
actually give these malicious agents inspiration and help in developing bad things.
On the other hand, keeping insights secret might prevent broader collaboration on preemptive countermeasures.
This is a dilemma currently (2023) still mostly faced in software by "ethical hackers"/"white hat hackers",
but it might increasingly apply to hardware too.

The dilemma in brief:

  • public: (-) may inspire bad actors; (+) allows more collaboration on countermeasures
  • secret: (+) bad actors need to figure it out themselves; (-) less collaboration on countermeasures

It seems that for now the far term gemstone metamaterial technology is still far enough away that most things
(that may eventually become critical in this regard) can still be discussed fully in the open.
Once this era ends, the secrets themselves can become a source of problems.

Actually, with de-novo protein nanotech, the still high level of proprietariness in biotech
might give a push in the more secretive direction.

Pseudonymity potentially helping in the de-escalation of extreme conflict war-fronts

Given topics with a really wide hostile rift between two group-think opinion sides,
it can be a really bad idea to put oneself non-pseudonymously in the middle in order to try to de-escalate the situation.
Each side is likely to shun and perhaps even attack one for interacting in a friendly way with the other side,
or for trying to bend the narrative towards the middle from their side.
One might end up in the middle between the more or less proverbial fronts of war, shunned and attacked as a traitor by both sides.
Technical support and permission for the usage of pseudonyms can help in attempting bilateral de-escalation without taking on excessive personal risks.
Using pseudonyms as throwaway sock-puppets, though, is clearly an abuse of them.

(wiki-TODO: this section needs its own main page)

Related

  • Opportunities (the complementary page to this page: Dangers)
  • Disaster proof



Easy to produce in great quantity and highly dangerous:

  • Poisons and (maybe to a lesser degree, as they are more complex molecules: Drugs)
  • Explosives – metastable very high energy compounds like e.g. sp3 solid nitrogen.
  • Diamondoid dust. Accidental inhalation may lead to silicosis-like symptoms.
    See: Spill avoidance guideline. But no one cares in war. The degree of the problem depends on the type of base material gemstone used. Water soluble gems pose less of a problem.

Truly terrifying:

  • Gemstone metamaterial based weapons
  • Rogue airborne microcomponent maintenance microbots as stealthy as pollen.
    Very advanced technology level. Product surfaces featuring combination locks may be a countermeasure that eventually becomes necessary.
  • Those airborne micro-robots (not nanobots) acting as harmful diamondoid dust and set free in high quantities, as no one cares about pollution and spills in war.

In most cases greatly overestimated threats:

  • Grey goo horror fable, Reproduction hexagon, Replication pentagon


Autonomous independent local (at home!) nuclear proliferation of rogue individuals:

  • people secretly building their own small-scale nuclear reactors in the basement (no ill intent, but a bad outcome is likely)
  • people secretly making dirty bombs at home
  • people secretly making nuclear bombs at home

This worry comes up as isotope sorting no longer needs giant factories with centrifuges. See: Isotope sorting
This still needs notable amounts of ore. Also, such a system's implementation code is nowhere near as trivial as that for making poisons and explosives.
It is less easy to hack one together only from other, individually innocuous, openly accessible code components.
Related: APM and nuclear technology

External links

  • Weapon of mass destruction: https://en.wikipedia.org/wiki/Weapon_of_mass_destruction
  • Weapon of mass destruction (German page): https://de.wikipedia.org/wiki/Massenvernichtungswaffe (from ABC-Weapons, now CBRN-Weapons)