Add: Toxic Comment Classification #1198

@KowshikaSinivasan

Description:
I propose adding a Toxic Comment Classification feature that classifies comments as "toxic" or "not toxic." The implementation will:

Solution:
Preprocess the text (remove stopwords, punctuation, lowercase).
Use TF-IDF for feature extraction.
Train a simple classifier.
Evaluate using accuracy and F1-score.
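The steps above could be sketched roughly as follows. This is an illustrative sketch only: the issue does not name a specific classifier, so Logistic Regression is used here as one simple choice, and the toy comments and labels are made up for demonstration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score
from sklearn.pipeline import Pipeline

# Toy data for illustration only (not a real dataset).
comments = [
    "you are an idiot",
    "thanks for the helpful answer",
    "this is garbage, shut up",
    "great explanation, much appreciated",
]
labels = [1, 0, 1, 0]  # 1 = toxic, 0 = not toxic

# Preprocessing: TfidfVectorizer lowercases text and drops punctuation
# via its default token pattern; stop_words="english" removes stopwords.
model = Pipeline([
    ("tfidf", TfidfVectorizer(lowercase=True, stop_words="english")),
    ("clf", LogisticRegression()),  # a simple classifier; any would do
])
model.fit(comments, labels)

# Evaluate with accuracy and F1-score (here on the training data,
# just to show the metric calls; real evaluation needs a held-out set).
preds = model.predict(comments)
print("accuracy:", accuracy_score(labels, preds))
print("F1:", f1_score(labels, preds))
```

In practice the data would be split into train and test sets (e.g. with `train_test_split`) before computing the metrics.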

Alternatives:
Pretrained models (e.g., BERT): These are more complex and require more resources. Starting with a simpler approach is easier for beginners.
Manual moderation: This is time-consuming and does not scale compared to an automated model.

Kindly assign me this issue.
