Soufiane OUMBAREK

Nudity & Violence Detection Extension - Using AI to Detect Harmful Visual Content in Real Time

This extension detects nudity, sexual content, and violence in images as the user browses the web. It helps identify sensitive or inappropriate visual content in real time, supporting safer browsing, content moderation, and parental control without storing or sharing user data. - Oumbarek-Soufiane/nudity-violence_detection_extenstion

**Inspiration:** The idea for this project came from my **daily experience surfing the internet**, where I constantly encounter **violent and explicit images**. This kind of content is especially harmful for **teenagers**, who can easily be exposed to it without warnings or protection. I wanted to build something useful that could **help make online platforms safer** by detecting sensitive content automatically and in real time.

**Problems:** I encountered several obstacles during development:

- **Gemini 1.5** did not work as expected for my use case
- **Gemini 2.5** introduced other limitations and unstable responses
- Collecting images and **reading them directly from URLs** caused parsing and format issues
- Handling real-time performance without delays was challenging

**How I Built the Project:** I started by integrating the **Gemini API** for image analysis. My goal was to classify images based on the presence of **nudity or violence**. The workflow is simple (a concrete sketch follows the list below):

1. Collect an image URL
2. Fetch and preprocess the image
3. Send it to the Gemini API
4. Analyze the response and flag sensitive content

In theory, this can be summarized as:

\[ \text{Safety Decision} = f(\text{Image}) \rightarrow \{\text{Safe}, \text{Sensitive}\} \]

where the function \( f \) is powered by the Gemini vision model.
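To make steps 2 through 4 concrete, here is a minimal TypeScript sketch of the classification step. It assumes the public Gemini `generateContent` REST endpoint; the model name, the prompt wording, and the `apiKey` parameter are illustrative placeholders, not the extension's actual code:

```typescript
// Illustrative model name; swap in whichever Gemini vision model you use.
const MODEL = "gemini-2.5-flash";
const GEMINI_ENDPOINT =
  `https://generativelanguage.googleapis.com/v1beta/models/${MODEL}:generateContent`;

type SafetyDecision = "Safe" | "Sensitive";

// Step 2: fetch the image and base64-encode it for the API.
// FileReader avoids manual ArrayBuffer-to-base64 conversion in the browser.
async function fetchImageAsBase64(
  url: string,
): Promise<{ mimeType: string; data: string }> {
  const response = await fetch(url);
  const blob = await response.blob();
  const dataUrl: string = await new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => resolve(reader.result as string);
    reader.onerror = reject;
    reader.readAsDataURL(blob);
  });
  // dataUrl looks like "data:image/jpeg;base64,...."
  const [header, data] = dataUrl.split(",");
  const mimeType = header.slice(5, header.indexOf(";"));
  return { mimeType, data };
}

// Steps 3-4: send the image to Gemini and map the reply to a decision.
async function classifyImage(
  url: string,
  apiKey: string,
): Promise<SafetyDecision> {
  const { mimeType, data } = await fetchImageAsBase64(url);
  const body = {
    contents: [{
      parts: [
        {
          text:
            "Does this image contain nudity, sexual content, or violence? " +
            "Answer with exactly one word: SAFE or SENSITIVE.",
        },
        { inline_data: { mime_type: mimeType, data } },
      ],
    }],
  };
  const res = await fetch(`${GEMINI_ENDPOINT}?key=${apiKey}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const json = await res.json();
  const text: string = json.candidates?.[0]?.content?.parts?.[0]?.text ?? "";
  return text.toUpperCase().includes("SENSITIVE") ? "Sensitive" : "Safe";
}
```

Prompting the model for a single-word verdict keeps the response cheap to parse and makes the \( f(\text{Image}) \rightarrow \{\text{Safe}, \text{Sensitive}\} \) mapping explicit, though a production version would also need error handling and rate limiting for real-time use.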