June 25th, 2024

Bots Compose 42% of Overall Web Traffic; Nearly Two-Thirds Are Malicious

Akamai Technologies reports that 42% of web traffic comes from bots, and roughly 65% of that bot traffic is malicious. Ecommerce faces data theft and rising fraud driven by web scraper bots. Mitigation strategies and compliance considerations are advised.

Read original article

Akamai Technologies released a report stating that bots make up 42% of web traffic, with 65% of that bot traffic classified as malicious. The ecommerce sector is heavily impacted, particularly by web scraper bots used for competitive intelligence, espionage, and other harmful activities. These bots steal data, are used to create fake sites, and increase fraud losses. AI-powered botnets are evolving to scrape unstructured data and to aid phishing campaigns. The report outlines mitigation strategies, emphasizing that defending against scraper bots also improves website performance and security, and it addresses compliance considerations raised by these attacks. Akamai's State of the Internet reports provide insights on cybersecurity and web performance trends, offering solutions to protect digital experiences.
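
The report's specific countermeasures aren't detailed in this summary, but a common first line of defense against scraper bots is per-client rate limiting. The sketch below is a minimal, illustrative Python example; the window length, request threshold, and client-identification scheme are assumptions for illustration, not figures or recommendations from Akamai's report.

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 10   # assumed sliding-window length
    MAX_REQUESTS = 20     # assumed per-window request budget
    _history = defaultdict(deque)  # per-client request timestamps (in-memory only)

    def looks_like_scraper(client_id: str) -> bool:
        """Flag clients that exceed the assumed request budget within the window."""
        now = time.monotonic()
        timestamps = _history[client_id]
        timestamps.append(now)
        # Discard timestamps that have fallen out of the sliding window.
        while timestamps and now - timestamps[0] > WINDOW_SECONDS:
            timestamps.popleft()
        return len(timestamps) > MAX_REQUESTS

In practice this state would live in a shared store rather than process memory, and rate limiting would be only one signal among several (user-agent analysis, TLS fingerprinting, behavioral scoring).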

Related

OpenAI and Anthropic are ignoring robots.txt

Two AI startups, OpenAI and Anthropic, are reported to be disregarding robots.txt rules, allowing them to scrape web content despite claiming to respect such regulations. TollBit analytics revealed this behavior, raising concerns about data misuse.
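
For context, robots.txt is purely advisory: a well-behaved crawler has to check it voluntarily before fetching a page. A minimal compliance check using Python's standard library might look like the sketch below; the user-agent string and URLs are illustrative placeholders.

    from urllib.robotparser import RobotFileParser

    # Illustrative values: "GPTBot" is OpenAI's published crawler user agent,
    # and example.com stands in for any target site.
    parser = RobotFileParser()
    parser.set_url("https://example.com/robots.txt")
    parser.read()

    if parser.can_fetch("GPTBot", "https://example.com/some/article"):
        print("robots.txt allows this fetch")
    else:
        print("robots.txt disallows this fetch")

Nothing enforces this check, which is precisely the behavior TollBit's analytics called out.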

We need an evolved robots.txt and regulations to enforce it

In the era of AI, the robots.txt file faces limitations in guiding web crawlers. Proposals advocate for enhanced standards to regulate content indexing, caching, and language model training. Stricter enforcement, including penalties for violators like Perplexity AI, is urged to protect content creators and uphold ethical AI practices.

The Encyclopedia Project, or How to Know in the Age of AI

Artificial intelligence challenges information reliability online, blurring real and fake content. An anecdote underscores the necessity of trustworthy sources like encyclopedias. The piece advocates for critical thinking amid AI-driven misinformation.

Hackers 'jailbreak' powerful AI models in global effort to highlight flaws

Hackers exploit vulnerabilities in AI models from OpenAI, Google, and xAI, sharing harmful content. Ethical hackers challenge AI security, prompting the rise of LLM security start-ups amid global regulatory concerns. Collaboration is key to addressing evolving AI threats.

What everyone gets wrong about the 2015 Ashley Madison scandal

The 2015 Ashley Madison scandal exposed the use of bots to engage users, revealing a broader pattern of fake profiles and automated interactions on social platforms, and serves as a caution about the challenges posed by AI-generated content.

3 comments
By @rapjr9 - 5 months
I wonder if the percentage of the world economy that is based on crime has increased in recent years due to the ease of committing crimes on the internet. Everybody's identity has been stolen multiple times, everything is hackable; seems like that would have accelerated crime. Maybe it explains the drop in violent crimes: it's so much easier to just buy stolen credit cards on the dark web or buy an automated ransomware attack suite than to fight a gang war? An essential driver behind crime seems to be laziness; criminals don't want to work for a living, they'd rather take resources from others because it seems easier. Cybercrime may be the easiest, lowest-risk crime ever (other than forming a corporation :-)
By @Q_is_4_Quantum - 5 months
Random question from a very-much-not-computer-savvy person, on the off chance someone cares to answer: if a tiny charge were levied every time a webserver delivered a page to me, would it cure this kind of problem? I'm imagining e.g. my browser has to send some crypto of some variety (or guarantee that it will in the future, so as to not slow things down).