Originally posted by
Nilftrondheim
In building a superintelligence, and on the question of the technological singularity, an outcome with 99.99999% certainty is the same as an outcome with 0.00001% certainty.
It does not matter how close researchers come to controlling a superintelligence: as long as that closeness is not 100%, the end result is the same as if no controlling mechanism had been implemented at all.
So, once again: if the safest means of transport in human history, after roughly 200 years of refinement and progress, still has not reached a certainty even close to 100%, then clearly building and controlling a superintelligence billions of billions of times smarter than the smartest living human would be nothing but self-deception.
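A minimal sketch of one way to read this argument, assuming the quoted claim is about a control mechanism that must hold on every single interaction: even 99.99999% per-event certainty compounds toward near-certain eventual failure once the number of events grows large. The per-event figure is taken from the quote; the event counts are illustrative assumptions, not figures from the post.

```python
# Illustrative sketch (not from the quoted post): how a tiny per-event failure
# probability compounds over many independent control events.

def cumulative_failure(per_event_success: float, events: int) -> float:
    """Probability of at least one control failure over `events` independent events."""
    return 1.0 - per_event_success ** events

if __name__ == "__main__":
    per_event_success = 0.9999999  # 99.99999% certainty per event (assumed reading of the quote)
    for events in (10**3, 10**6, 10**8, 10**9):
        p_fail = cumulative_failure(per_event_success, events)
        print(f"{events:>13,} events -> P(at least one failure) = {p_fail:.4f}")
```

Under this reading, the only per-event certainty that keeps the cumulative failure probability at zero for arbitrarily many events is exactly 100%, which is the quoted post's point.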
P.S.
--
The Importance of getting it right the first time and a centralized economy - Science & Technology of the Future - FutureTimeline.forum
Professor Stephen Hawking, Theoretical Physicist - The Theory of Everything - YouTube
Recently Stephen Hawking gave a talk about AI (linked above) in which he emphasized the importance of
"getting it right the first time" with regard to key technologies such as Artificial Intelligence, Synthetic Biology, and Nanotrchnology. He comes at this from the perspective of the story of intelligence as a story about information. However, in order to control information it cannot be free. It has to be contained somehow with security measures, restricted access, etc. First of all I would like to ask
"Is this even possible?" If you look at the story from the vantage of a network of evolving information, then based on the billions of years of history that we have as evidence we should expect at least two things.
1. Information entropy increases. There is no perfect closed system that can contain it indefinitely. Once it escapes it can replicate very rapidly.
2. The diversity and variety of information should continue to increase. The increase in the entropy of information seems to be closely related to what we call intelligence (a rough sketch of the link between variety and entropy follows below). To try to curtail the process of increasing information entropy is to squelch the very intelligence we wish to create.
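As a rough, hypothetical illustration of the entropy pointed at in point 2 (my example, not something from the thread): the Shannon entropy of a pool of messages grows with the variety of messages circulating in it, so a tightly controlled, near-uniform pool carries less entropy than a diverse one. The message pools below are made up for illustration.

```python
# Illustrative sketch: Shannon entropy of a controlled vs. a diverse message pool.
from collections import Counter
from math import log2

def shannon_entropy(symbols: list[str]) -> float:
    """Shannon entropy (bits per symbol) of the empirical distribution of `symbols`."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * log2(c / total) for c in counts.values())

if __name__ == "__main__":
    contained = ["A"] * 15 + ["B"]         # near-uniform, tightly controlled pool
    diverse = ["A", "B", "C", "D"] * 4     # same size, four variants in circulation
    print(f"contained pool: {shannon_entropy(contained):.2f} bits")  # ~0.34
    print(f"diverse pool:   {shannon_entropy(diverse):.2f} bits")    # 2.00
```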
Secondly, from a more pragmatic perspective: what political and economic systems would be necessary for a species to control these key technologies enough to ensure survival?
I don't think it would be capitalism, because, for example, the short-term profit in creating autonomous weapons systems would generally seem to outstrip the potential existential threat they create, much as we are seeing with the global warming bullshit from industry and the possible ramifications of genetically modified foods. Hawking directly references the need to curtail the weaponization of autonomous systems.
Wouldn't political and economic control have to become more centralized in order to control the spread of information regarding these key technologies, and thereby ensure survival?