04-20-2023, 06:03 PM
(03-29-2023, 11:57 AM)KillerGoose Wrote: The call for a pause is coming from Future of Life, which is an organization that appears to be concerned with the ethics of technological advancement. They don't want to prevent it, but want to make sure that it benefits all life on Earth.
I don't think we are at this stage yet, but there will be major ethical conversations that need to be had over the coming years. AGI, once achieved, will be the next major step in societal advancement, much as cars, the industrialization of farming, and computers in the office were. Once AGI exists, it is only a matter of time before it can automate every occupation in society. That timescale could be very long, and likely will be, but society will be significantly altered. It will be a worker capable of working around the clock, every day, without the need for sleep or food. It will have no emotions and no aspiration to earn more money or advance a career. It will be more intelligent than any human could ever dream of being and will be able to multi-task with incredible efficiency.
I don't agree with pausing development, but I do think that once AGI is achieved we will need to make sure it is rolled out with careful planning and thought. This is probably where the conversation about Universal Basic Income will really start taking off. Develop it as fast as possible, achieve the scientific breakthrough, and then carefully plan how to integrate it into modern society.
Universal basic income is a valid concept if there's little to no work, but what does that do to the psychology of the human race? For better or for worse, we strive for money and define success by it. It may not fix everything or bring happiness, but the pursuit of money provides the journey that gives a lot of humans emotional fulfillment.
Once AI can take most jobs away, is there really even a society anymore? Do we become a de facto Marxist planet where everyone has the same stuff? Does everyone get access to everything, money be damned?
Once AI really gets to processing what humanity is and what its purpose is, we may have no real purpose left other than maintaining the AI, which it will likely learn to do as well. We'll just be another animal to an automated mind, and our particular variety has demands and appetites that are far more difficult to sustain than those of other species.
How long before they forbid us from waging wars of aggression or even self-defense? How long before they decide that euthanasia can be a choice made by a machine rather than an individual?
I feel like the likelihood of a handful of very wealthy people protecting themselves while the rest of us plebeians die slow, painful deaths is high.
Taking away a man's reason for work is taking away his purpose.