
Human keys reinforce scientific integrity


Disclosure: The views and opinions expressed here are those of the author alone and do not represent the editorial views and opinions of crypto.news.

The speed at which AI is advancing poses a danger to data, identity, and reputation verification and, if left unchecked, could increase the incidence of misinformation and slow the progress of scientific innovation. The march towards highly intelligent AI is presented by its strongest advocates as a push towards a scientific golden age. However, this push raises the risk that our society hits a disruptive technical plateau, where widespread adoption of immature AI technology caps, and over time reduces, human creativity and innovation.

This will sound counterintuitive to most accelerationists. AI is supposed to improve our ability to work faster and synthesize more information. However, AI cannot replace inductive reasoning or the experimental process. Today, anyone can use AI to form a scientific hypothesis and use it as an aid to generate a scientific paper. The output of products like Aithor often appears authoritative on the surface and may pass peer review. This is a big problem because AI-generated texts are already being held up as legitimate scientific results and often include fabricated data to support their claims. There is a strong incentive for young researchers to use whatever means they have to compete for a limited number of academic jobs and funding opportunities. The current incentive system in academia rewards those who publish the most papers, whether or not those papers report valid findings, so long as they pass peer review and attract enough citations.

Academic content with unverified authorship will also become a pressing issue for businesses that depend on basic science to power their research and development, the very R&D that keeps our society functioning and sustains the quality of life for a growing global population. Left unaddressed, well-funded R&D will only be able to rely on research it can conduct and reproduce in-house, increasing the value of trade secrets and dealing a devastating blow to open science and access to meaningful information.

Expensive replication efforts alone cannot deal with misinformation; the problem is much bigger than that. Today we face an erosion of trust in the very foundations of knowledge, where unproven claims and questionable attribution undermine scientific advances and threaten the scientific community. There is a dire need to establish a reality-based economy that can verify content and data reliably.

AI systems are only as powerful as the data they are trained on

Large language models are excellent tools for generating convincing content; however, they are only as informative as the data they are trained on, and their ability to extrapolate outside of the training set remains limited. Science is not only about synthesizing existing knowledge but about creating new knowledge that adds to the entropy of humanity's accumulated body of knowledge. Over time, as more people use AI to generate content and fewer people generate original content, we will face a "low-entropy bloat" that introduces little genuinely new information into the world and simply repeats past experience. Sources will be lost as new "knowledge" is built on self-referential, AI-generated content unless we build a robust provenance and proof layer into the AI tools used for real research.
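To make the idea of a provenance layer concrete, here is a minimal sketch, in TypeScript, of how a researcher-held key could bind a signed provenance record to a piece of content so that downstream tools can check it originated from a human-held key. This is not any specific protocol; the key handling, identifiers, and record fields are illustrative assumptions.

```typescript
// Minimal provenance sketch (assumptions, not a defined standard):
// hash the content and sign the hash with an author-held key.
import { createHash, generateKeyPairSync, sign, verify } from "node:crypto";

interface ProvenanceRecord {
  contentHash: string; // SHA-256 digest of the manuscript or dataset
  authorKeyId: string; // identifier for the author's public key (placeholder DID)
  signature: string;   // signature over the content hash, base64-encoded
  timestamp: string;   // when the record was created
}

// Hypothetical author key pair; in practice this would live in the
// researcher's identity wallet rather than be generated inline.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

function createProvenanceRecord(content: Buffer): ProvenanceRecord {
  const contentHash = createHash("sha256").update(content).digest("hex");
  const signature = sign(null, Buffer.from(contentHash), privateKey).toString("base64");
  return {
    contentHash,
    authorKeyId: "did:example:researcher-123", // placeholder identifier
    signature,
    timestamp: new Date().toISOString(),
  };
}

function verifyProvenance(content: Buffer, record: ProvenanceRecord): boolean {
  // In a real system the public key would be resolved from authorKeyId.
  const contentHash = createHash("sha256").update(content).digest("hex");
  return (
    contentHash === record.contentHash &&
    verify(null, Buffer.from(contentHash), publicKey, Buffer.from(record.signature, "base64"))
  );
}

// Usage: sign a manuscript once, verify it anywhere downstream.
const manuscript = Buffer.from("Results: effect size d = 0.42 ...");
const record = createProvenanceRecord(manuscript);
console.log("provenance verified:", verifyProvenance(manuscript, record));
```

The point of the sketch is simply that authorship becomes a checkable claim rather than an assertion: any AI tool or reviewer downstream can recompute the hash and verify the signature.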

This "lobotomization" of the intellectual depth of the collective human corpus will have a lasting impact on medical, economic, and academic research, as well as the arts and creative pursuits. Unverified data can influence studies, distort outcomes, and lead to significant policy or technological failures that erode the authority of scientific research. The dangers of AI-generated "science" are manifold. Routine scientific activity will stall on authorship disputes, plagiarism allegations, and compromised review. We will need to devote ever more time and energy to dealing with the consequences of shrinking quality and accuracy in scientific research.

AI is a useful tool for stimulating ideas, structuring thoughts, and automating repetitive tasks; it must remain in support of human-generated content, not in place of it. It should not be used to write scientific papers that propose original conclusions without the underlying work being done, but rather as an aid to increase the efficiency and accuracy of human-led efforts. For example, AI can be helpful in running simulations on existing data with established methods and automating that work to surface leads for new research. However, the experimental protocol and human creativity necessary for scientific investigation cannot be easily replaced.

Building a reality-based economy

A truth-based economy establishes a framework of systems and standards to ensure the truth, integrity, transparency, and traceability of information and data. It addresses the need to establish trust and authentication across a technological society, allowing individuals and organizations to rely on the accuracy of shared knowledge. Value is rooted in the truth of claims and the authenticity of ideas and primary sources. A reality-based economy will make digital knowledge "hard" in the way that Bitcoin made money hard. This is the promise of the decentralized science (DeSci) movement.

How do we get there? We must start with the most important element in the scientific world: the individual researcher and their work. Current web standards for scientific identity fall short of validating claims of identity and proof of work. Conventional practice makes it very easy to create a persona with a malleable reputation, and peer review is also vulnerable to bias and prejudice. Without verification of the metadata accompanying a scientific claim, a truth-based economy for science cannot be established.

Improving academic identity standards can start with a simple cross-platform login powered by privacy-preserving identity verification technology. Users should be able to log into any site with their credentials, verify their authenticity, and selectively reveal facts about their reputation or data to other agents or users.
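As an illustration of what selective disclosure could look like in such a login flow, the sketch below models a credential presentation that reveals only chosen attributes. It is a simplified mock, not Holonym's actual API: the proof field is a placeholder for a real zero-knowledge proof, and all names are assumptions.

```typescript
// Illustrative selective-disclosure sketch (placeholder proof, hypothetical names).
interface ResearcherCredential {
  subject: string;                    // e.g. a decentralized identifier (DID)
  attributes: Record<string, string>; // e.g. { degree: "PhD", field: "neuroscience" }
}

interface Presentation {
  subject: string;
  revealed: Record<string, string>;   // only the attributes the user chose to reveal
  proof: string;                      // placeholder for a ZK proof binding revealed values to the credential
}

// The wallet reveals only the requested attributes; a real system would also
// produce a cryptographic proof that they belong to a valid credential.
function present(credential: ResearcherCredential, reveal: string[]): Presentation {
  const revealed: Record<string, string> = {};
  for (const attr of reveal) {
    if (attr in credential.attributes) revealed[attr] = credential.attributes[attr];
  }
  return { subject: credential.subject, revealed, proof: "zk-proof-placeholder" };
}

// A journal or preprint server checks the proof and only ever sees the
// revealed attributes, e.g. "holds a PhD" without learning the institution.
function verifyLogin(p: Presentation, required: string[]): boolean {
  const proofOk = p.proof.length > 0; // stand-in for real proof verification
  return proofOk && required.every((attr) => attr in p.revealed);
}

const cred: ResearcherCredential = {
  subject: "did:example:researcher-123",
  attributes: { degree: "PhD", field: "neuroscience", affiliation: "Example University" },
};
const loginPresentation = present(cred, ["degree", "field"]);
console.log("accepted:", verifyLogin(loginPresentation, ["degree"]));
```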

A verifiable identity rooted in the reputation of a proven researcher is the foundation of DeSci. A fully on-chain scientific economy would allow both public and anonymous participation in large-scale online coordination of research activities. Research labs and decentralized self-regulatory organizations could create permissionless systems and bounty programs that cannot be gamed by fake reputation or identity claims. A secure, universal scientific record on a blockchain, anchored to identity claims, would provide a reference framework for autonomous bodies built to gather proven scientific knowledge and test false hypotheses.
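For illustration only, here is a hedged sketch of what one entry in such a universal scientific record might contain, and how a replication bounty could key off verified attestations. The field names and rule are assumptions, not a defined standard.

```typescript
// Hypothetical shape of an entry in a universal scientific record.
interface Attestation {
  reviewer: string;            // reviewer's identity claim (e.g. a DID)
  verdict: "supports" | "refutes" | "inconclusive";
  signature: string;           // signature over (contentHash, verdict)
}

interface ScientificRecordEntry {
  contentHash: string;         // hash of the paper, dataset, or protocol
  authorClaims: string[];      // identity claims of verified human authors
  provenance: string[];        // hashes of prior entries this work builds on
  attestations: Attestation[]; // reviews or replication results from verified reviewers
}

// A replication bounty could be expressed as: pay out when an entry
// accumulates N "supports" attestations from distinct, verified reviewers.
function bountyMet(entry: ScientificRecordEntry, n: number): boolean {
  const supporters = new Set(
    entry.attestations.filter((a) => a.verdict === "supports").map((a) => a.reviewer)
  );
  return supporters.size >= n;
}
```

Because the reputation behind each reviewer identity is verified rather than self-asserted, a scheme along these lines is harder to game with sybil accounts or fabricated credentials.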

Protect future human progress

We need to establish the foundations of truth through information transparency and rigorous verification to avoid a breach of trust within expert research fields. The chances that our collective progress continues for hundreds more years, unlocking ongoing scientific breakthroughs in materials science, biotechnology, neuroscience, and complexity science, will depend on how we treat quality research and sound data. This will be the difference between a future society as far advanced beyond us as we are beyond pre-enlightenment societies, and one where we have to accept that this is as smart as we get as a species, and that we will only get dumber. Whether or not DeSci will save us is unclear, but there is limited time to get things right.

Shady El Damaty


Shady El Damaty is a co-founder of the Holonym Foundation, which is building a solution for universal identity and safe digital access with a decentralized identity protocol built on zero-knowledge authentication. In 2020, he built OpSci, the first decentralized science, or DeSci, organization. Before starting his career in crypto, Shady earned his PhD in neuroscience from Georgetown University in Washington, D.C.



