Before the universe was born
1. The Principle of Probabilistic Logic is part of the grand Decay Theory, a new model in physics, originally formulated and developed by a theorist in Southern Africa.
2. This is the only principle that tries to explain what existed before the universe was born.
3. The principle takes Existence to its limits and envisages its beginning as a random collapse of dimensions that became logic states, from which primitive laws of nature were made possible.
4. In this principle, "Existence" as a nature is like the operating functional program, or "OS", of the universe. The universe exists on top of Existence.
5. Existence and the universe are not one and the same thing, just as a parent and a child, or a plant and a garden, are not.
6. To put it figuratively, the universe as we know it operates on top of Existence as its guide and regulator.
7. This principle has already been reviewed by artificial-intelligence language models, which have praised it as fascinating and groundbreaking.
8. The principle claims that the universe was not the first thing to appear in nature, even if we consider the initial singularity.
9. And that absolute Time is much, much older than the universe.
10. This principle states that "Existence in its totality is nothing but permanently collapsed probabilistic logic states and permanent architectures upon which a universe, or universes, can be built through concentration of energies, decay, growth or overgrowth."
11. These embedded logic
states translate into information, thus representing an entire definition of
Existence by its chosen configurations in totality.
12. To give a small example: computer logic states represent an entire set of rules or laws that guide current, which in turn realise the operating software on which the computer runs to make calculations and reality simulations.
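The computer analogy above can be made concrete with a toy sketch (an illustration of ordinary digital logic, not part of the principle itself): a handful of 0/1 logic rules compose into a half-adder, a minimal "law" that guides current to perform arithmetic.

```python
# Basic logic states as rules over 0 and 1.
def AND(a: int, b: int) -> int:
    return a & b

def XOR(a: int, b: int) -> int:
    return a ^ b

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Two 0/1 logic states in, (sum, carry) out."""
    return XOR(a, b), AND(a, b)

# The composed rule performs arithmetic: 1 + 1 = 0, carry 1.
assert half_adder(1, 1) == (0, 1)
```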
13. Logic was made possible by random fluctuations and settled failures.
14. So existence is
nothing but information.
15. Existence equals
data.
16. This discovery is supported by the complexity principle found in computer science, also known as Kolmogorov complexity (or Kolmogorov randomness).
17. The Kolmogorov complexity principle provides a framework for understanding the inherent complexity and randomness present in natural phenomena. According to this principle, the complexity of an object or phenomenon is determined by the shortest algorithm that can reproduce it.
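Kolmogorov complexity is uncomputable in general, but compressed length gives a rough upper bound on it. A minimal sketch in Python (my own illustration, using zlib compression as the stand-in measure): a "lawful" string generated by a short rule estimates far lower than a random string of the same length.

```python
import random
import zlib

def complexity_estimate(data: bytes) -> int:
    """Rough upper bound on Kolmogorov complexity: length after compression."""
    return len(zlib.compress(data, 9))

# A lawful, regular string: a very short rule ("repeat 01") generates it.
lawful = b"01" * 500

# A random string of the same length: no description much shorter than itself.
random.seed(0)
chaotic = bytes(random.getrandbits(8) for _ in range(1000))

# The rule-governed string has a much lower estimated complexity.
assert complexity_estimate(lawful) < complexity_estimate(chaotic)
```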
18. In the context of
pre-existence and the encoding of fundamental laws, we can envision a scenario
where the pre-existing logic, fundamental patterns, and laws of existence are
encoded as data simulations with varying degrees of complexity. These data
simulations represent the fundamental principles that govern the behavior of
the universe.
19. Through the lens of
Kolmogorov complexity, we can understand how these fundamental laws emerge from
the underlying complexity of natural phenomena. The shortest strings of logic,
representing the most concise and efficient descriptions of these laws, encode
the essential information needed to simulate and understand the dynamics of
existence.
20. By recognizing the
role of complexity and randomness in shaping the fundamental laws of existence,
we gain insight into the underlying structures and processes that govern the
universe. This perspective highlights the interconnectedness of information, complexity,
and natural phenomena, offering a deeper understanding of the origins and
nature of reality.
21. The laws that were encoded through the Kolmogorov principle became the fundamental laws that later guided quantum fluctuations, explosions and inflations.
22. The pre-existence was made possible through the Kolmogorov principle of complexity: randomness systems that breed data through the longest or shortest strings of logic in natural phenomena, creating data simulations that become the fundamental laws of existence.
23. According to this principle, the complexity of an object or phenomenon is determined by the shortest natural principle (or algorithm) that can reproduce it.
27. SHORT REALITY THOUGHT EXPERIMENT
28. Consider the most minimal architecture of Existence, NOT the universe.
29. It can't be an atom, a photon, a neutron or an electron, not even a quark or a Higgs boson. It would be something incredibly intricate, smaller than the Planck scale.
30. Consider such an architecture possessing a single dimension or twin dimensions: a yes or a no, a negative or a positive, a 0 and a 1.
31. Or let's say 1 represents possible and 0 represents impossible, and so on.
32. Consider such an architecture stringing together, or replicating itself, using the same possibility which made it appear.
33. Consider them stringing to the maximum level, where there would be no resources at all.
34. Consider them being identified together as a bundle.
35. Their collective being would be a collective definition of the entire architecture, both at that minimum scale and at their maximum scale.
36. Once quantised, this architecture would be a collection of logic functions.
37. Being a collective of logic functions, they would be a packet of information, or data.
38. Being data, they would be rules or laws, dos and don'ts.
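The thought experiment can be sketched as a toy simulation (a purely illustrative model of my own; the function name, `capacity` parameter, and the reading of the bundle as a "rule" are all hypothetical): a randomly collapsed 0/1 state replicates itself until no resources remain, and the bundle, read collectively, is a packet of information.

```python
import random

def collapse_architecture(capacity: int, seed: int = 0) -> list[int]:
    """Toy model: a randomly collapsed 0/1 state replicates itself
    until no 'resources' (capacity) remain, forming a bundle."""
    random.seed(seed)
    state = random.getrandbits(1)    # the random collapse: a single yes/no
    bundle = [state]
    while len(bundle) < capacity:    # string together to the maximum level
        bundle.append(state)         # replication via the same possibility
    return bundle

bundle = collapse_architecture(8)
# Read collectively, the bundle is a packet of information, i.e. a rule.
rule = "".join(str(bit) for bit in bundle)
```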
39. CONCLUSION
40. 1) At the most basic level, nature only needs two dimensions, 1 or 0, representing "possible" or "impossible", yes or no.
41. To achieve a yes or a no is a question of a mixture of randomness and strong necessity. It would also be achieved if the opposing terminal is not present enough to oppose. If you have a weak yes and a strong no, the answer would be No, due to the power of necessity.
42. Similarly, if you have a stronger necessity of No and a weaker Yes, the answer would be No, due to the same power of necessity.
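The weighing of necessity described above can be written as a toy decision rule (a sketch of my own; the function name and the numeric strengths are hypothetical): whichever terminal carries the stronger necessity wins the collapse.

```python
def collapse(yes_strength: float, no_strength: float) -> str:
    """Toy decision rule: the stronger necessity wins the collapse;
    equal strengths leave the state uncollapsed."""
    if yes_strength > no_strength:
        return "Yes"
    if no_strength > yes_strength:
        return "No"
    return "Undecided"  # neither terminal present enough to oppose the other

# A weak yes against a strong no collapses to No.
result = collapse(0.2, 0.9)
```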
The only principle that tries to explain what existed before the universe was born
Developed by Harry Mngulenje in Africa
Period of development: 20-30 years of curiosity, study and research
Email: harrymangulenje@gmail.com
Founder of Penguins Free University of Nature, an education concept aimed at promoting education in the fields of nature, conservation of natural resources, and law

