Instagram will begin notifying parents if their teenagers repeatedly search for suicide- or self-harm-related content, marking the first time owner Meta has proactively flagged search behaviour rather than simply blocking it. From next week, parents and teenagers enrolled in Instagram’s “Teen Accounts” supervision programme in the UK, US, Australia and Canada will receive alerts if a young user searches for harmful terms within a short period of time. The feature will be rolled out...