
Today, we've published a report providing a unique insight into the online supply pathway of in-app ads which harmfully objectify women. It is the first time the ASA has undertaken systematic monitoring of ads in mobile apps.
We found relatively few instances of harmful in-app ads in our monitoring (8 of 5,923 ads served to our avatars). However, those we did find were serious breaches of the rules and we take a zero-tolerance approach to them. The ASA upheld complaints about 11 similar ads in the period 2023 to 2024.
The harmful ads we found in our three-month monitoring of 14 gaming apps included, in animated content:

- Harmful stereotyping of women as sexual objects
- Sexual encounters which were implied to be non-consensual
- The use of pornographic tropes
We set out to explore through case studies how those ads came to appear and what steps the in-app ad industry could take to reduce the likelihood of similar ads appearing in future.

Our rules hold advertisers responsible for ensuring ads they create do not cause harm. However, other parties involved in serving ads to users play a vital part in protecting people by spotting inappropriate ads and stopping them from being seen by users.
We know from our own research published today that almost half of the UK population is concerned about the depiction of women and girls in ads, with 44% of people concerned about the objectification of women and girls. No one - especially children - should be exposed to ads that reinforce harmful gender stereotypes, especially those that may incite or condone violence against women.
Overall, the report presents a positive picture: most of the companies involved in serving in-game ads are taking steps to make sure that ads don’t portray women and girls in a harmful way. However, it is unacceptable for any ads which reinforce negative stereotypes or depict women in a harmful way to be served to app users. This is in clear breach of our rules.
Our case studies revealed varied reasons for the ads having appeared, although all of them ultimately resulted from irresponsible creative choices by the advertisers.
Other factors identified in the report included misclassification of the advertised app's content, which failed to alert the publisher to the ads' likely nature, and a lack of knowledge of UK advertising standards among non-UK parties. We also identified a particular risk that romance story or AI chat apps containing sexual content could breach the rules on offence and harm by reflecting that content in ads appearing in apps where people would not expect to see it.
We welcome the steps some parties have already taken to proactively review and improve their policies and processes as a result of our investigation.
Following this report, CAP has published specific guidance for in-app ads to help advertisers get it right. We will be further engaging with the in-app ad industry to ensure that guidance is widely seen and understood and will continue to take a strong stance on ads which cause harm.
Read the ASA’s full report here and the White Bullet Methodology report here.