How to respond to Bark alerts
Bark alerts include context and details to help you learn more about what’s going on.
- Check the platform: On the top left of the alert, you can see where the issue was detected. In this example, the alert came from an Instagram post.
- Look at the timestamp: The timestamp may be the time the content was scanned by Bark, or it may be the time the content was actually sent or received.
- Read the recommended actions: At the bottom of each alert, we include recommended actions from experts about how to handle the situation, along with helpful links.
Need more information on an alert?
If you don't have enough information to understand an alert and take action, you may want to:
- Spot-check your child's device or account: Some kids delete content, so it may not be there anymore. If it is still there, however, you may be able to learn more about the context surrounding the issue. Some devices have a "Recently Deleted" folder that may be helpful to check.
- Rate the alert helpfulness: At the bottom of each alert, you can provide feedback on it. This helps us train our AI to flag things more consistently.
How can I learn more about "This image may contain child nudity" in an alert?
Bark is required by law to delete potential child nudity from our servers. When we do, the original image or video in the alert is replaced with a gray placeholder image. Sometimes the system errs on the side of caution, meaning it may have applied the placeholder to adult nudity instead.
Follow the recommended steps above to check your child's accounts or devices for the image or video. If you cannot find it, it may have already been deleted. On a Bark Phone, our system will automatically delete images saved to the phone that contain nudity, as part of our upcoming sexting prevention feature.