The rise of AI and the threat to the future of real science
Opinion
By Dr Xavier Musonye | Jan 17, 2026
Throughout the history of civilisation, technological evolution has largely improved human life. From the control of fire and the invention of the wheel to the steam engine and the computer mainframe, each breakthrough has propelled society forward. Yet every leap has been accompanied by anxiety.
Literature captured these fears memorably through Mary Shelley’s Frankenstein, where a human-made creation escapes its maker’s control. As artificial intelligence rapidly evolves, similar unease is resurfacing, particularly within scientific and academic inquiry.
In what many nostalgically call the “golden era” of scholarship, research was slow, deliberate, and deeply human. Each source demanded careful reading to understand its methodology, conclusions, and the unanswered questions it left behind. The internet later simplified access, but comprehension still required effort and intellectual engagement.
Digital tools played a supportive role, helping with grammar or clarity rather than content. Knowledge advanced through discipline, not shortcuts. That rhythm changed dramatically in 2022, when AI text generators such as ChatGPT became accessible.
Research, once measured in months or years, could now be produced in minutes. Just as social media was said to have caused an “invasion of idiots into public spaces,” academia began facing a different invasion: one of unethical practices enabled by AI. Literature reviews, analyses, and even full papers could be machine-generated, raising profound ethical and technical concerns about the future of science. AI has made academic misconduct easier, faster, and harder to detect. Deep-learning systems can fabricate datasets, manipulate images, and generate experimental results that appear plausible at first glance.
While some AI-generated or falsified studies have been retracted, they are often discovered only through painstaking peer review or whistleblowing. This reality risks turning peer reviewers into forensic investigators, diverting time and resources from evaluating scientific merit to policing authenticity. Text generation poses an equally serious challenge. AI-produced papers may look original but often amount to algorithmic rearrangements of existing work.
Worse still, AI tools frequently invent references: nonexistent articles, misattributed authors, or incorrect titles. Such pseudo-scholarship undermines intellectual property, clogs academic journals, and erodes trust in scientific publishing. The difficulty of identifying AI-generated text further compounds the problem, allowing questionable research to circulate unnoticed.
The implications in higher education are just as troubling, where AI has intensified the “technological arms race” between cheating methods and detection systems. Students and even academics can submit work without conducting genuine research, blurring the line between assistance and deception. Misusing AI effectively becomes a form of misattributed authorship: claiming credit for work not personally done.
While AI can also aid in preventing cheating through tools like watermarking or identity verification, its ethical use must be clearly taught and enforced. No chatbot can conduct laboratory experiments, test materials, or observe biological processes in real-world conditions.
AI itself is not the enemy. Used responsibly, it can enhance data analysis, language editing, literature searches, and discovery. The danger lies in treating it as a substitute for the scientific method rather than a complement to it. Science advances through curiosity, rigour, and engagement with reality, qualities no algorithm can replicate. As academia navigates this new era, it must reaffirm the values of integrity, proper attribution, and hands-on inquiry.
-The writer is a researcher (Energy Systems)