How To Retrieve Shorts URLs From a YouTube Channel in Python

In this post, you will learn two methods to retrieve all Shorts URLs from a YouTube channel in Python.

To create a Python script that retrieves all the Shorts video links from a YouTube channel, you can use the YouTube Data API v3. The process involves querying the API for videos from a specific channel, filtering for those with #shorts in their title or description, and then extracting their video links.

Method 1. Retrieve All Shorts URLs From a YouTube Channel in Python by Using the Google API Client Library

Step 1: Get a YouTube Data API Key

  1. Go to the Google Developers Console.
  2. Create a new project.
  3. Navigate to Library, search for YouTube Data API v3, and enable it for your project.
  4. Go to Credentials, create an API key, and note it down.

Step 2: Install Google API Client Library

You need the Google API Client library for Python. Install it using pip:

pip install --upgrade google-api-python-client

Step 3: Python Script to Retrieve Shorts Links

Below is a basic script to start with. It lists videos from the specified channel and keeps those whose title or description contains the #shorts hashtag. The search API cannot filter by duration or explicitly identify Shorts, and not every Short carries the hashtag, so treat this as a heuristic.

from googleapiclient.discovery import build

# User variables
API_KEY = 'YOUR_API_KEY'  # Replace with your YouTube Data API v3 key
CHANNEL_ID = 'CHANNEL_ID_HERE'  # Replace with the Channel ID you're interested in

youtube = build('youtube', 'v3', developerKey=API_KEY)

def get_shorts_video_links(channel_id):
    video_links = []
    next_page_token = None

    while True:
        request = youtube.search().list(
            part='snippet',
            channelId=channel_id,
            type='video',
            maxResults=50,  # Adjust based on how many results you want per page (max 50)
            pageToken=next_page_token
        )
        response = request.execute()

        for item in response.get('items', []):
            # Check if video title or description contains '#shorts'
            snippet = item['snippet']
            if '#shorts' in snippet['description'].lower() or '#shorts' in snippet['title'].lower():
                video_id = item['id']['videoId']
                video_links.append(f"https://www.youtube.com/shorts/{video_id}")

        next_page_token = response.get('nextPageToken')
        if not next_page_token:
            break

    return video_links

if __name__ == "__main__":
    shorts_links = get_shorts_video_links(CHANNEL_ID)
    for link in shorts_links:
        print(link)

Replace YOUR_API_KEY with your actual API key and CHANNEL_ID_HERE with the channel ID from which you want to retrieve shorts.
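If you only have the channel's URL, note that channel IDs are the 24-character strings beginning with UC that appear in /channel/ URLs. As a small convenience (this helper is my own addition, not part of the script above), you could extract the ID with simple string parsing:

```python
from urllib.parse import urlparse

def channel_id_from_url(url):
    """Extract a channel ID from a URL like
    https://www.youtube.com/channel/UCxxxxxxxxxxxxxxxxxxxxxx.
    Returns None for handle- or custom-name URLs, which need an
    API lookup to resolve instead."""
    parts = urlparse(url).path.strip('/').split('/')
    if len(parts) == 2 and parts[0] == 'channel' and parts[1].startswith('UC'):
        return parts[1]
    return None

print(channel_id_from_url('https://www.youtube.com/channel/UC_x5XG1OV2P6uZZ5FSM9Ttw'))
# → UC_x5XG1OV2P6uZZ5FSM9Ttw
```

Handle-style URLs such as youtube.com/@name return None here; those must be resolved through the API or the page source.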

How It Works:

  • The script uses the search().list method to find videos by the specified channel ID.
  • It checks each video to see if its description or title contains #shorts.
  • It compiles a list of video links that match the criteria.
  • Note: The API’s search().list method does not directly support filtering by video length or explicitly identifying shorts, so this script uses the presence of #shorts as a heuristic.
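The filtering step can be isolated into small pure functions, which makes the heuristic easy to test and to swap out later. The function names and sample items below are illustrative, but the item layout matches what search().list returns:

```python
def is_probable_short(item):
    """Heuristic: treat a search result as a Short if '#shorts'
    appears in its title or description (case-insensitive)."""
    snippet = item['snippet']
    text = (snippet['title'] + ' ' + snippet['description']).lower()
    return '#shorts' in text

def shorts_url(item):
    """Build a Shorts-style URL from a search result item."""
    return f"https://www.youtube.com/shorts/{item['id']['videoId']}"

# Two fake search results in the API's item layout
items = [
    {'id': {'videoId': 'abc123defgh'},
     'snippet': {'title': 'Quick tip #Shorts', 'description': ''}},
    {'id': {'videoId': 'xyz987uvwtt'},
     'snippet': {'title': 'Full tutorial', 'description': 'Long-form video'}},
]
links = [shorts_url(i) for i in items if is_probable_short(i)]
print(links)  # → ['https://www.youtube.com/shorts/abc123defgh']
```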

Limitations and Considerations:

  • Quota Costs: The YouTube Data API has quota limits. Listing videos (search().list) consumes quota. Monitor your usage in the Google Developers Console.
  • Accuracy: This script assumes that any video from the channel with #shorts in the title or description is a shorts video. Adjust the filtering logic based on your specific needs or if YouTube’s definition of shorts changes.
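To put the quota cost in perspective, a quick back-of-the-envelope calculation helps. The numbers below assume Google's currently published defaults (a search().list call costs 100 units and a project's default daily quota is 10,000 units); verify both in your own console:

```python
# Assumed values from Google's published quota table; verify in your console.
DAILY_QUOTA = 10_000        # default daily quota units per project
SEARCH_LIST_COST = 100      # units per search().list call
RESULTS_PER_PAGE = 50       # maxResults upper bound

max_pages_per_day = DAILY_QUOTA // SEARCH_LIST_COST
max_results_per_day = max_pages_per_day * RESULTS_PER_PAGE
print(max_pages_per_day, max_results_per_day)  # → 100 5000
```

In other words, a naive run tops out at roughly 100 search pages, or about 5,000 results, per day under the default quota.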

Remember to respect YouTube’s API Terms of Service and Google’s API User Data Policy when using the API.

Method 2. Retrieve All Shorts URLs From a YouTube Channel in Python by Web Scraping

In this method, we will use Python and BeautifulSoup to scrape video links from a YouTube channel page.

Retrieving YouTube shorts video links from a channel without using the YouTube Data API v3 involves alternative methods, such as web scraping, which has limitations and legal considerations. Web scraping YouTube can violate their Terms of Service, so reviewing YouTube’s terms is essential before proceeding.

If you still want to explore how it could be theoretically done for educational purposes, you could use libraries like requests and BeautifulSoup in Python to scrape the channel’s page for video links. However, keep in mind that this approach is not recommended or supported by YouTube, and it’s prone to breaking if YouTube changes their website’s structure.

Here’s an example of how you could theoretically scrape a webpage to find video links with Python. This code does not specifically target YouTube or any other service and is provided purely for educational purposes:

import requests
from bs4 import BeautifulSoup

# Example URL, replace with the actual channel/videos page you're interested in
url = ''

# Send a request to the URL
response = requests.get(url)

# Parse the HTML content of the page
soup = BeautifulSoup(response.text, 'html.parser')

# Find all video links, hypothetical example
video_links = soup.find_all('a', href=True)

for link in video_links:
    # Print the href attribute of each <a> tag (link)
    print(link['href'])

Replace the empty url value with the actual YouTube channel page from which you want to retrieve Shorts.
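Because YouTube pages embed video data in JavaScript blobs rather than plain anchor tags, a more realistic (though still fragile) variant is to search the raw page source for /shorts/ paths. The sketch below assumes the HTML contains strings like "/shorts/VIDEOID"; YouTube can change this at any time:

```python
import re

def extract_shorts_ids(page_html):
    """Find 11-character video IDs appearing in /shorts/ paths.
    Assumes the page source embeds strings like "/shorts/VIDEOID";
    this is fragile and may break if YouTube changes its markup."""
    ids = re.findall(r'/shorts/([A-Za-z0-9_-]{11})', page_html)
    # Preserve order while removing duplicates
    return list(dict.fromkeys(ids))

# A made-up fragment imitating what page source might contain
sample = '{"url":"/shorts/abc123defgh"} href="/shorts/abc123defgh" "/shorts/zzz999yyy88"'
print(extract_shorts_ids(sample))  # → ['abc123defgh', 'zzz999yyy88']
```

The same caveats apply: this is for educational purposes, and the regex is a guess about the page's internals, not a documented interface.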

Considerations for a Hypothetical YouTube Scraper:

  • Target Elements: YouTube’s structure is complex, and videos are loaded dynamically using JavaScript, making BeautifulSoup alone insufficient for scraping content that isn’t loaded in the initial HTML response.
  • JavaScript Rendering: To scrape a JavaScript-heavy site like YouTube, you’d need a tool capable of executing JavaScript, such as Selenium or Puppeteer (though Puppeteer is a Node.js library, there are Python equivalents like Pyppeteer).
  • Legal and Ethical Considerations: Scraper scripts can put a high load on websites, violate terms of service, and lead to your IP being banned. Always use respectful scraping practices, such as rate limiting your requests, and consider the legal implications.
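Rate limiting in practice can be as simple as enforcing a minimum gap between requests. A minimal sketch (the class and interval are my own illustration, not from any scraping library):

```python
import time

class RateLimiter:
    """Enforce a minimum interval between successive requests."""

    def __init__(self, min_interval_s=2.0):
        self.min_interval_s = min_interval_s
        self._last = None

    def wait(self):
        """Sleep just long enough to keep calls min_interval_s apart."""
        now = time.monotonic()
        if self._last is not None:
            remaining = self.min_interval_s - (now - self._last)
            if remaining > 0:
                time.sleep(remaining)
        self._last = time.monotonic()

limiter = RateLimiter(min_interval_s=0.1)
start = time.monotonic()
for _ in range(3):
    limiter.wait()          # a real scraper would issue its request here
elapsed = time.monotonic() - start
print(f"3 calls took at least {elapsed:.2f}s")
```

The first call proceeds immediately; each subsequent call is delayed so that requests are at least min_interval_s apart.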

Using the YouTube Data API is strongly recommended for a reliable, supported, and legal method of retrieving video links from a YouTube channel.
