The following table describes the status, the proposer and other metadata of this motion.
| Consultation: | Winter General Meeting 2026 |
|---|---|
| Agenda item: | 3. Motions of Policy and Organisation |
| Proposer: | Isaac Short (Durham Green Party) |
| Status: | Published |
| Submitted: | 01/21/2026, 19:44 |
Comments
Amber Lewis:
While no one can deny the harm social media has caused, this is not a viable solution.
Amber Lewis:
Isaac Short:
Charlie Aldous:
Isaac Short:
Jamie Strudwick:
Haydn Osborne-Brookes:
This is a view we were planning on expressing through a statement on social media (ironic!). However, you told us not to put out a statement on the issue, as you said there "may be a motion about it brought to WGM" and that "it's not a pressing issue". You did not make it clear that this motion was being submitted by yourself, and it does seem that we have missed the media storm around this issue.
It is standard procedure for the Young Greens Exec to write policy statements and vote on them in the absence of any policy (and to add to this, the national party only has policy for about 5% of votes in parliament - I'd imagine we have even less), so by telling us not to put the statement out, it does feel like you have interrupted this procedure purely because you disagree with our position. This links again to the "nothing about us without us" point, as it feels a bit unsettling that you did not consult us about this policy initially, considering the group we represent, and then went out of your way to stop our own views on it being shared, in favour of having policy administered to us rather than made with us, as is the usual procedure.
As for the content of this motion, a ban on social media for under 16s really wouldn't be an effective means of solving the issues we see around it. Yes, social media can have an extremely bad effect on people's mental health, but that effect doesn't suddenly disappear on someone's 16th birthday. The solution to these issues is holding social media companies accountable for the effects their platforms have on the mental health of people of ALL ages, not placing the burden on those who use the platforms. I would argue this is a principle we believe in across multiple areas, whether it be carbon taxes for polluters or tenants' rights regulations for landlords.
There are also serious questions around how this would actually be enforced. We have seen with the Online Safety Act how unenforceable and dangerous bans like this can be. Many people can simply get around them using VPNs, and enforcement would likely mean having people upload their IDs to websites and social media platforms - something that has already led to data breaches since the Online Safety Act (which I believe we oppose?) was introduced, leaving people's personal data freely available.
I'm also a bit confused about how you think your exact policy here would be effective. You have said that you'd like to see under 16s banned from posting and interacting with content, but not from actually viewing it. From my understanding, a lot of the harm on social media is actually caused by algorithms that push dangerous content onto young people. Obviously I'm not arguing for that to be banned too, but it does seem to undercut your logic here.
Finally, this motion would essentially force Esther and me to work with the national party to make a ban on social media for Under 16s their policy, and to support any legislation about this in parliament! We would both feel really uncomfortable about this, and I'm sure a lot of the people we represent would not be happy with it either.
Isaac Short:
In your position as joint Under 18s officer, you have a right to have your opinion heard about this motion and any motion that affects Under 18s, but that doesn't mean you have to be consulted on every motion that will affect Under 18s. I felt I did not need to discuss this with you, as I was writing this motion from the perspective of someone who has significant personal experience with social media and the harm it inflicts on people under the age of 18. My own life experiences, along with online research, were the consultation I had with Under 18s.
As to the contents: yes, the effects don't just end on your 16th birthday, but the effects of alcohol, gambling or smoking don't just end on your 18th birthday either. The age limits on regulated goods and services are set at a point where people can make an informed decision about interacting with them. This motion also does not absolve social media companies of guilt, nor detract from legislation attempting to make social media safer. This motion is not the ideal solution, but no other solutions have worked, and social media companies have proven time and time again that they will circumvent any attempt to enforce content moderation on them. All they care about is their bottom line, not user safety. They don't self-regulate, and cannot be trusted to.
Social media is also inherently addictive. No matter what laws and regulations we impose on social media companies, it will always be addictive. This means it will never be safe for children, and so, just like other addictive goods and services, there should be restrictions in place.
Enforcement would be down to the social media companies. If they fail to enforce the ban correctly, and loopholes are exploited or data is leaked, they would be fined for that, as is the case under the Australian ban. It is deliberately designed this way to avoid the issues with the Online Safety Act. The Online Safety Act also does not cover an addictive good or service, whereas this does, meaning they are two separate issues which should not be equated.
The majority of the harm from social media does come from content served to people by an algorithm. The point you're missing here is how those algorithms work. They work by looking at how a user with an account interacts with content. If you don't have an account and cannot interact with content, the algorithm cannot learn about you, build a profile of you or push personalised content to you. The really dangerous material is the personalised content. Removing the ability to create accounts and interact with content also removes the potential for cyberbullying.
Finally, if you or Esther would feel uncomfortable working with the national party on this, then that is your right. The Young Greens executive and officers are elected not to impart their own opinions to the national party, but to put forward the concerns and opinions of all Young Greens members. If this passes and you or Esther feel you can no longer represent the beliefs of the Young Greens as a whole because of your own beliefs, then it is your prerogative either to resign or to ask someone else to take on the responsibility of promoting this motion instead.
I hope this response answers your concerns, but feel free to put any more issues you have below.
Charlie Aldous:
Would you consider redrafting to reflect a limiting of social media content on the basis of vulnerability, in a similar way to how we would review someone's capacity for other tasks? (This redraft would likely best be done in consultation with U-18 Greens, Disabled Greens and Senior Greens.)
Isaac Short:
Amber Lewis:
Isaac Short:
Amber Lewis:
People who risk dying, because all of their support remains online. That this ban would risk destroying their access to
Leigh Williams:
This motion isn't just excluding young people from community; it's prohibiting their ability to communicate at all.