One in four teenagers suffered hate abuse online in the last year, a major report has revealed.
A survey of 13 to 18-year-olds found 24% reported that they were targeted on the internet because of their gender, sexual orientation, race, religion, disability or transgender identity.
One in 25 said they were singled out for abuse “all or most of the time”.
The alarming findings emerged in a study published to mark Safer Internet Day.
It revealed that more than four in five (82%) youngsters have seen or heard “online hate” in the previous 12 months, with 41% suggesting it had become more rife.
Researchers defined online hate as behaviour targeting people or communities via the internet because of their gender, transgender identity, sexual orientation, disability, race, ethnicity, nationality or religion.
It could be offensive, mean or threatening, and either targeted directly at a person or group, or generally shared online. In its most extreme form it can break the law and become a hate crime.
Social media platforms were the most common domains in which youngsters witnessed hate on the internet, according to the report.
It said youngsters with disabilities and those from black, Asian, Middle Eastern or other minority ethnic communities were more likely to see online hate.
The survey of more than 1,500 teenagers also found 93% of respondents had seen their friends posting supportive, kind or positive content about a certain group in the last year.
Will Gardner, chief executive of the charity Childnet and director of the UK Safer Internet Centre, which published the study, said: “While it is encouraging to see that almost all young people believe no one should be targeted with online hate, and heartening to hear about the ways young people are using technology to take positive action online to empower each other and spread kindness, we were surprised and concerned to see that so many had been exposed to online hate in the last year.
“It is a wake-up call for all of us to play our part in helping create a better internet for all, to ensure that everyone can benefit from the opportunities that technology provides for building mutual respect and dialogue, facilitating rights, and empowering everyone to be able to express themselves and be themselves online - whoever they are.”
Peter Wanless, chief executive of the NSPCC, said the internet industry has a duty to keep young people safe.
“Socialising online is central to children and young people’s lives today, so it’s very worrying that so many are witnessing or experiencing online hate,” he said.
Education Secretary Nicky Morgan said: “The internet is a powerful tool which can have brilliant and virtually limitless benefits, but it must be used sensibly and safely.
“We are working hard to make the web a safer place for children but we can’t do it alone and parents have a vital role to play in educating young people.”
Online abuse and cyber bullying have emerged as major issues alongside the explosion in popularity of social media sites.
Last year figures showed convictions for crimes under a law to prosecute internet “trolls” have increased eight-fold in a decade.
A separate survey found that more than three quarters of 10 to 12-year-olds in the UK had social media accounts even though they are below the age limit.
Facebook was the most widely used among 10 to 12-year-olds at 49%, while Instagram was used by 41% of the same age group, BBC Newsround said.
But both sites require UK account holders to be over 13 before signing up and urge users to report accounts they believe are used by under-age children.