This is the every-frame-in-order bot template that you can deploy anywhere. It currently runs as Every Danmachi Frame in Order.
## Requirements

- yt-dlp
- ffmpeg
- Python 3.12 (this environment uses 3.12.10)
- a DBMS (MySQL for this template)
- ~15 GB of disk space (depends on the number of frames)
## Frame Preprocess and Environment Preparation

1. Find your favorite anime included in a YouTube playlist.

2. Extract the video titles and their video ids:

   ```
   yt-dlp --flat-playlist --print "%(playlist_index)s - %(title)s: %(id)s" "youtube_playlist_url" > output.txt
   ```

3. Pack this information into a csv with the following columns:

   | video_id | youtube_title | index | season_episode | title | done |
   | --- | --- | --- | --- | --- | --- |
   | id | youtube_title | playlist_index | S01E01 | title | False |
   | ... | ... | ... | S01E02 | ... | ... |
4. Create an environment for the .py scripts (for Linux users, jump to the Deploy on Ubuntu Linux section for steps 4-5 and 8-9):

   ```
   python -m venv .venv
   ```

   If you are using VSCode, the venv will activate automatically.
5. Install packages into the venv:

   ```
   pip install -r requirements.txt
   ```
6. Run e.py:
   - you probably need a different rule for extracting `anime-episode:title` from `youtube_title` if the current method doesn't fit
   - the example settings are 2 frames per second and a jpg quality of 5; feel free to adjust
   - the frames will be stored in the Frames directory

7. Once `playlist.csv` is created, run writedb.py:
   - the df1/df2/df3 code block rearranges the episode order; you can ignore it if not needed
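The extraction rule for e.py mentioned above could be regex-based. The pattern below is a guess for illustration only; the function name is mine, and you should adapt the pattern to your playlist's actual title format.

```python
import re

# Assumed youtube_title shape: "Anime Name Episode 3: Some Title".
# This pattern is an example, not the repo's actual rule.
EP_RE = re.compile(r"Episode\s*(\d+)\s*[:：]\s*(.+)", re.IGNORECASE)

def extract_episode_title(youtube_title, season=1):
    """Return (season_episode, title) extracted from a YouTube title,
    or (None, raw title) if the pattern doesn't match."""
    m = EP_RE.search(youtube_title)
    if m is None:
        return None, youtube_title  # fall back to the raw title
    episode = int(m.group(1))
    return f"S{season:02d}E{episode:02d}", m.group(2).strip()
```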
8. Create a database and a table in your DBMS of choice. For the table (storing the posting information), the schema can look like this:

   ```sql
   DROP TABLE IF EXISTS db.anime_name;
   /*!40101 SET @saved_cs_client = @@character_set_client */;
   /*!50503 SET character_set_client = utf8mb4 */;
   CREATE TABLE db.anime_name (
     `id` int NOT NULL AUTO_INCREMENT,
     `episode` varchar(10) NOT NULL,
     `title` varchar(200) NOT NULL,
     `frame_start` int NOT NULL,
     `frame_end` int NOT NULL,
     `filename` varchar(100) NOT NULL,
     `post_time` datetime DEFAULT NULL,
     PRIMARY KEY (`id`),
     UNIQUE KEY `uniq_episode_frame` (`filename`)
   ) ENGINE=InnoDB AUTO_INCREMENT=262141 DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;
   ```
9. Import `df.csv` into the table.
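To sanity-check the schema before wiring up MySQL, here is a self-contained sketch using stdlib sqlite3 instead (types simplified). The "pick the lowest-id unposted frame" query is my assumption about how the bot selects rows, not code from the repo.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Simplified sqlite translation of the MySQL schema above.
conn.execute("""
    CREATE TABLE anime_name (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        episode TEXT NOT NULL,
        title TEXT NOT NULL,
        frame_start INTEGER NOT NULL,
        frame_end INTEGER NOT NULL,
        filename TEXT NOT NULL UNIQUE,
        post_time TEXT DEFAULT NULL
    )
""")
conn.executemany(
    "INSERT INTO anime_name (episode, title, frame_start, frame_end, filename)"
    " VALUES (?, ?, ?, ?, ?)",
    [("S01E01", "title", 1, 2, "Frames/S01E01/S01E01_1.jpg"),
     ("S01E01", "title", 1, 2, "Frames/S01E01/S01E01_2.jpg")],
)
# Assumed selection rule: the lowest id that has not been posted yet.
row = conn.execute(
    "SELECT id, episode, filename FROM anime_name"
    " WHERE post_time IS NULL ORDER BY id LIMIT 1"
).fetchone()
# row == (1, 'S01E01', 'Frames/S01E01/S01E01_1.jpg')
```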
## Get Meta Graph API Page Access Token

- Create an app in Facebook Developer.
- Add use cases for the facebook page; at least `pages_show_list`, `pages_read_engagement` and `pages_manage_posts` are required.
- Go to Meta Business Suite. Click the cog icon in the bottom left, then click pages from account; link your facebook page and grant yourself total control.
- Create a system user and grant it full control over both the app and the page.
- Generate a system user access token: choose the app linked to your page, set the token expiration to never, grant the necessary use cases, and finally copy the token.
- Go to the Graph API Test Tool, then GET ... me/accounts; this shows the pages this system user has a role on. Locate `access_token` inside the curly brackets, check that the name is the dedicated facebook page, then copy the `access_token`.
- Paste the access token into the Graph API Access Token Debug Tool and check that the token type is Page and the expiration date is Never. The never-expiring access token, page id and app id are needed for the bot setup.
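The "locate `access_token` inside the curly brackets" step can also be done programmatically. This sketch parses a response shaped like the me/accounts output (the field values below are placeholders, and the helper name is mine):

```python
import json

# Placeholder response mimicking the me/accounts JSON shape:
# {"data": [{"access_token": ..., "name": ..., "id": ...}, ...]}
sample = json.loads(
    '{"data": [{"access_token": "EAAG...", "name": "My Page", "id": "1234567890"}]}'
)

def page_token(accounts_json, page_name):
    """Pick the access_token and page id for the named page
    out of a me/accounts-style response."""
    for page in accounts_json["data"]:
        if page["name"] == page_name:
            return page["access_token"], page["id"]
    raise KeyError(page_name)
```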
## Get Sentry DSN

- Sign up for Sentry.io.
- Get the Sentry DSN from the setup instructions. The DSN is needed for the bot setup.
## Bot Setup

- Copy the SQL connection information, the facebook-related token and ids, and the sentry DSN into .env: (`HOST`, `USER`, ...), (`PAGE_ACCESS_TOKEN`, `PAGE_ID`, `APP_ID`), (`SENTRY_DSN`). If you are deploying this on Ubuntu Linux, keep `HOST` empty.
- Adjust the post interval (in seconds) in config.py if needed; the default is 600.
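job_scheduler.py itself is not shown in this README. As a rough, stdlib-only skeleton of what such a posting loop does (the row shape and the three callables are assumptions for illustration; the real script talks to MySQL and the Graph API):

```python
import time
from datetime import datetime

POST_INTERVAL = 600  # seconds; mirrors the config.py default

def pick_next_frame(rows):
    """Assumed selection rule: first row whose post_time is unset.
    Rows are (id, episode, filename, post_time) tuples/lists."""
    for row in rows:
        if row[3] is None:
            return row
    return None

def run(fetch_rows, post_photo, mark_posted, interval=POST_INTERVAL):
    """Skeleton of the posting loop; fetch_rows, post_photo and
    mark_posted stand in for the real DB and Graph API code."""
    while True:
        row = pick_next_frame(fetch_rows())
        if row is None:
            break  # everything has been posted
        frame_id, episode, filename, _ = row
        print(f"Posting Episode {episode}, File: {filename}")
        post_photo(filename)
        mark_posted(frame_id, datetime.now())
        print(f"Success: {frame_id}: File: {filename}")
        time.sleep(interval)
```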
- Run the bot:

  ```
  python job_scheduler.py
  ```

## Deploy on Ubuntu Linux

MySQL server (or any other DBMS) should be installed directly on the machine.
1. Install python3.12, pip3 and venv, then create and activate the venv:

   ```
   sudo apt update
   sudo add-apt-repository ppa:deadsnakes/ppa
   sudo apt install python3.12 python3-pip python3-venv
   python3 -m venv .venv
   source .venv/bin/activate
   ```
2. Install packages into the venv:

   ```
   pip3 install -r requirements.txt
   ```
3. Do steps 6-7 from the Frame Preprocess and Environment Preparation section.
4. Install mysql server:

   ```
   sudo apt update
   sudo apt install mysql-server
   ```
5. Set the root password and secure the installation:

   ```
   sudo mysql_secure_installation
   ```
6. Start the service and log in:

   ```
   sudo systemctl start mysql
   sudo mysql
   ```
7. Import df.csv into the table:

   ```sql
   LOAD DATA INFILE 'df.csv'
   INTO TABLE db.table
   FIELDS TERMINATED BY ',' ENCLOSED BY '"'
   LINES TERMINATED BY '\n'
   IGNORE 1 ROWS;
   ```
8. Take your access tokens and Sentry DSN from the Get Meta Graph API Page Access Token and Get Sentry DSN sections and put them in the .env file.
9. Run the bot:

   ```
   python3 job_scheduler.py
   ```
## Deploy with Docker

Make sure you have finished steps 1-8 above.

- Allow connections from the container:

  ```sql
  CREATE USER 'your_user'@'172.17.0.%' IDENTIFIED BY 'your_password';
  GRANT ALL PRIVILEGES ON your_db.* TO 'your_user'@'172.17.0.%';
  FLUSH PRIVILEGES;
  ```
- Exit mysql, then run `sudo ufw allow 3306/tcp` to allow connections from outside. Run `hostname -I` to get your machine's IP address and set the `HOST` value in the .env file to that address.
- Change `bind-address` to `0.0.0.0` in `/etc/mysql/mysql.conf.d/mysqld.cnf`.
- Build the image and wait for the container to be built (check the cd path first):

  ```
  docker build -t anime-bot .
  ```

- Run the container:

  ```
  docker run -d --name anime-bot -v $PWD/Frames:/app/Frames anime-bot
  ```

- Check recent posts:

  ```
  docker logs --tail 20 anime-bot
  ```

- Check whether posts were published with HTTP 500 (yes, the Meta Graph API really is unreliable):

  ```
  docker logs anime-bot 2>&1 | grep -A 1 -B 1 Recovered
  ```
Example log output:

```
Bot started...
Posting Episode S01E01, File: Frames/S01E01/S01E01_1.jpg
Success: 1: File: Frames/S01E01/S01E01_1.jpg
Posting Episode S01E01, File: Frames/S01E01/S01E01_2.jpg
...
```
- If an error occurs or you want to stop the bot, hit Ctrl + C in the terminal.
## Credits

- ESFIO
- 每一個BanG Dream Its Mygo幀 (Every BanG Dream It's MyGO Frame)
- ChatGPT