SINGULARITY

This article is about the mathematical and physical concept of singularity. For the Omni-Philosophy, see: Rule of One

In mathematics, a singularity is, in general, a point at which a given mathematical object is not defined, or a point of an exceptional set where it fails to be well-behaved in some particular way, such as differentiability.

For example, the real function f(x) = 1/x has a singularity at x = 0, where it seems to "explode" to ±∞ and is not defined. The function g(x) = |x| (see absolute value) also has a singularity at x = 0, since it is not differentiable there.
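Both behaviours can be probed numerically. The following Python sketch (purely illustrative; the function names mirror the text) evaluates f near 0 to show the blow-up, and compares the one-sided difference quotients of g at 0 to show the failure of differentiability:

```python
# Illustrative sketch of the two singularities at x = 0 described above.

def f(x):
    return 1.0 / x   # undefined at x = 0; "explodes" to +/- infinity nearby

def g(x):
    return abs(x)    # continuous everywhere, but not differentiable at 0

# f grows without bound as x approaches 0 from either side.
for x in [0.1, 0.01, 0.001, -0.001]:
    print(f"f({x}) = {f(x)}")

# g has no derivative at 0: the one-sided difference quotients
# converge to different limits (+1 from the right, -1 from the left).
h = 1e-8
right = (g(0 + h) - g(0)) / h    # approaches +1
left = (g(0 - h) - g(0)) / (-h)  # approaches -1
print(right, left)
```

Since the two one-sided limits disagree, no single slope exists at x = 0, which is exactly why g is singular there despite being continuous.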

The algebraic curve defined by {(x, y) : y³ − x² = 0} in the (x, y) coordinate system has a singularity (called a cusp) at (0, 0). See Singular point of an algebraic variety for details on singularities in algebraic geometry. See Singularity theory for singularities in differential geometry.
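Why (0, 0) is the singular point of this curve can be made explicit: for a curve F(x, y) = 0, a point on the curve is singular precisely where both partial derivatives of F vanish. A short check for the cusp above:

```latex
F(x,y) = y^3 - x^2, \qquad
\frac{\partial F}{\partial x} = -2x, \qquad
\frac{\partial F}{\partial y} = 3y^2 .
```

Both partials vanish only at (x, y) = (0, 0), which lies on the curve since F(0, 0) = 0; at every other point of the curve at least one partial is nonzero, so the cusp at the origin is the curve's unique singular point.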

The technological singularity (also, simply, the singularity)[1] is the hypothesis that the invention of artificial superintelligence (ASI) will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization.

According to this hypothesis, an upgradable intelligent agent (such as a computer running software-based artificial general intelligence) would enter a "runaway reaction" of self-improvement cycles, with each new and more intelligent generation appearing more and more rapidly, causing an intelligence explosion and resulting in a powerful superintelligence that would, qualitatively, far surpass all human intelligence.

The first use of the concept of a "singularity" in the technological context is attributed to John von Neumann. Stanislaw Ulam reports a discussion with von Neumann "centered on the accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue". Subsequent authors have echoed this viewpoint.

I. J. Good's "intelligence explosion" model predicts that a future superintelligence will trigger a singularity.

The concept and the term "singularity" were popularized by Vernor Vinge in his 1993 essay The Coming Technological Singularity, in which he wrote that it would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate. He wrote that he would be surprised if it occurred before 2005 or after 2030.

Four polls, conducted in 2012 and 2013, suggested that the median estimate was a 50% chance that artificial general intelligence (AGI) would be developed by 2040–2050.

UNION Reality
World War III stifled the development of a true AI on Earth; development accelerated again after the Ascent in 2098 OTT. This led to the "Super Computer Crisis" and the creation of the AI Laws. Modern AIs are technically more intelligent than most beings, but fail-safe mechanisms prevent AIs from becoming sentient. Mothermachine is the largest and most intelligent known AI (with self-awareness).