Myrient Collect Links with Filters

Collects links, filters them, and copies the result to your clipboard with a single button click. It filters for links containing "(USA)" and excludes "(demo)" and "(kiosk)". Fairly easy to modify.

You will need to install an extension such as Tampermonkey to install this script.


Author
Dethkiller15
Daily installs
0
Total installs
15
Ratings
0 0 0
Version
1.0
Created
2024-12-18
Updated
2024-12-18
Size
2.2 KB
License
MIT
Applies to

I like batch downloading things rather than doing them manually. Who doesn't like seeing their storage fill up with the things they want (even if they'll never use them, lol)? So I "made" this.

The code is fairly easy to modify. Press the button that appears in the bottom left and it will put all the matching links on your clipboard. It filters for links containing "(USA)" and excludes "(demo)" and "(kiosk)". Save that to a .txt file and use your favorite downloader to fetch the files.
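For reference, the same include/exclude logic can be reproduced outside the browser. A minimal sketch with grep, using made-up filenames (not real archive entries):

```shell
# Build a demo link list (hypothetical names, one per line).
printf '%s\n' \
  'Game A (USA).zip' \
  'Game B (Europe).zip' \
  'Game C (USA) (Demo).zip' \
  'Game D (USA) (Kiosk).zip' > links.txt

# Keep lines containing "(USA)", then drop any with "(demo)" or "(kiosk)"
# (case-insensitive, fixed-string matching).
grep -F '(USA)' links.txt | grep -viF -e '(demo)' -e '(kiosk)'
# prints: Game A (USA).zip
```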

For those whose downloader lacks that functionality, here is a .bat script.

Put it in the same location as a urls.txt file and run the .bat file. It downloads every URL listed in urls.txt into a folder it creates, called "urls.txt downloads", within the same directory.
@echo off
:: Define the input file and download folder
set "inputFile=urls.txt"
set "outputDir=%~dp0urls.txt downloads"

:: Check if urls.txt exists
if not exist "%inputFile%" (
    echo Error: "%inputFile%" not found in the current directory.
    pause
    exit /b
)

:: Create the download folder if it doesn't exist
if not exist "%outputDir%" (
    mkdir "%outputDir%"
)

:: Download each URL from urls.txt
echo Starting downloads...
for /f "usebackq delims=" %%u in ("%inputFile%") do (
    echo Downloading: %%u
    curl -L -o "%outputDir%\%%~nxu" "%%u"
)

echo All downloads completed. Files saved in "%outputDir%".
pause
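For non-Windows users, here is a rough bash port of the same idea. This is my own hedged sketch, not part of the original script: it assumes curl is installed, names each file after the last path component of its URL, and starts by writing a demo urls.txt with a local file:// URL purely so the sketch runs standalone.

```shell
#!/usr/bin/env bash
# Demo setup (hypothetical data, only so the sketch is self-contained):
# a local file exposed through a file:// URL listed in urls.txt.
echo "demo payload" > sample.bin
printf 'file://%s/sample.bin\n' "$PWD" > urls.txt

inputFile="urls.txt"
outputDir="urls.txt downloads"

# Bail out if the URL list is missing, like the batch version does.
if [ ! -f "$inputFile" ]; then
    echo "Error: \"$inputFile\" not found in the current directory."
    exit 1
fi

# Create the download folder if it doesn't exist.
mkdir -p "$outputDir"

# Download each URL, skipping blank lines.
echo "Starting downloads..."
while IFS= read -r url; do
    [ -z "$url" ] && continue
    echo "Downloading: $url"
    curl -sL -o "$outputDir/$(basename "$url")" "$url"
done < "$inputFile"

echo "All downloads completed. Files saved in \"$outputDir\"."
```

Note that basename only works cleanly here because these URLs end in a plain filename; URLs with query strings would need extra handling.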