
youtube-to-chatbot's Introduction

YouTube-to-Chatbot 🤖🎥

YouTube-to-Chatbot logo

👉 Original launch announcement 🚀

Introduction

Welcome to YouTube-to-Chatbot, a Python notebook that allows you to train a chatbot on an entire YouTube channel. 🌟

This repository provides a notebook that utilizes the power of YouTube, OpenAI, Langchain, and Pinecone to build a conversational agent capable of mimicking the content, knowledge, and tone of any YouTube channel. By extracting information from the channel's videos and training a chatbot, we can create an AI-powered assistant that engages in meaningful conversations with users.

YouTube-to-Chatbot Demo

🧑‍🏫 How it Works

  1. Start by adding the ID of the YouTube channel you'd like to clone.
  2. Obtain API keys for OpenAI, YouTube, and Pinecone.
  3. Run each step of the notebook to extract data from YouTube, train the chatbot, and deploy the model (a rough sketch of the extraction step follows this list).
  4. Interact with your newly created chatbot and witness its ability to hold intelligent conversations based on the channel's content.
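The extraction phase in step 3 boils down to listing the channel's video IDs with the YouTube Data API and pulling each video's transcript. Below is a minimal sketch, assuming the google-api-python-client and youtube-transcript-api packages (API names as of the older releases this notebook targets); `API_KEY` and `CHANNEL_ID` are placeholders, not values from the notebook:

```python
import googleapiclient.discovery
from youtube_transcript_api import YouTubeTranscriptApi

API_KEY = "YOUR_YOUTUBE_DATA_API_KEY"  # placeholder: YouTube Data API v3 key
CHANNEL_ID = "UC..."                   # placeholder: channel to clone

def get_channel_video_ids(channel_id, api_key, max_videos=50):
    """List video IDs for a channel via the YouTube Data API search endpoint."""
    youtube = googleapiclient.discovery.build("youtube", "v3", developerKey=api_key)
    video_ids, page_token = [], None
    while len(video_ids) < max_videos:
        params = dict(part="id", channelId=channel_id, type="video", maxResults=50)
        if page_token:
            params["pageToken"] = page_token
        response = youtube.search().list(**params).execute()
        video_ids += [item["id"]["videoId"] for item in response["items"]]
        page_token = response.get("nextPageToken")
        if not page_token:
            break
    return video_ids[:max_videos]

def get_transcripts(video_ids):
    """Fetch each video's transcript text, skipping videos that have none."""
    transcripts = {}
    for vid in video_ids:
        try:
            segments = YouTubeTranscriptApi.get_transcript(vid)
            transcripts[vid] = " ".join(seg["text"] for seg in segments)
        except Exception:
            continue  # no transcript available (disabled, private, etc.)
    return transcripts

transcripts = get_transcripts(get_channel_video_ids(CHANNEL_ID, API_KEY))
```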

💬 Benefits for Creators and Communities

YouTube-to-Chatbot aims to unlock new possibilities for content creators and foster community growth. With this project, creators can:

  • Provide an interactive and engaging experience for their audience.
  • Offer personalized recommendations and responses to viewers.
  • Automate routine tasks such as answering frequently asked questions.
  • Expand their reach by enabling chatbot interactions across various platforms.

🎯 Early Access and Contributions

To get early access to new features and updates, follow me at @ehalm_ on Twitter. This is just the start! If you are a creator interested in a custom model or a developer eager to contribute to this project, feel free to shoot me a DM.

🏁 Getting Started

To start using YouTube-to-Chatbot, follow these steps:

  1. Clone this repository to your local machine.
  2. Access the notebook here to run it on Google Colab.
  3. Make sure you have the necessary API keys and permissions.
  4. Fill in the required information in the notebook, such as the YouTube ID and API keys.
  5. Run each step in the notebook to train and deploy your chatbot (a rough sketch of the indexing and query steps follows this list).
  6. Engage in conversations with your AI assistant and explore its capabilities!
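For orientation, here is a hedged sketch of what the training and query steps in step 5 roughly amount to: embedding the transcript chunks into a Pinecone index and querying it through a conversational retrieval chain. It assumes the pre-1.0 LangChain and pinecone-client v2 interfaces and a Pinecone index that already exists; the names and values are placeholders rather than the notebook's exact code:

```python
import os
import pinecone
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Pinecone
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationalRetrievalChain

# Placeholders: supply your own keys and an existing Pinecone index (dimension 1536).
os.environ["OPENAI_API_KEY"] = "sk-..."
pinecone.init(api_key="YOUR_PINECONE_KEY", environment="YOUR_PINECONE_ENV")

# Placeholder transcript chunks -- in practice, the text extracted from the channel.
texts = ["transcript chunk one", "transcript chunk two"]

# Embed the chunks and upsert them into the Pinecone index.
vectorstore = Pinecone.from_texts(texts, OpenAIEmbeddings(), index_name="youtube-chatbot")

# Conversational chain that retrieves relevant chunks for each question.
chain = ConversationalRetrievalChain.from_llm(
    ChatOpenAI(temperature=0), retriever=vectorstore.as_retriever()
)

result = chain({"question": "What does this channel say about X?", "chat_history": []})
print(result["answer"])
```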

Let's empower creators and revolutionize community engagement together! 🚀✨

youtube-to-chatbot's People

Contributors

emmethalm

youtube-to-chatbot's Issues

cell 2

name 'googleapiclient' is not defined

NameError                                 Traceback (most recent call last)
Cell In[3], line 59
     56 print(transcripts)
     57 write_to_file(transcripts)
---> 59 main(api_key, channel_id)

Cell In[3], line 54, in main(api_key, channel_id)
     53 def main(api_key, channel_id):
---> 54     video_ids = get_channel_videos(channel_id, api_key)[:20]
     55     transcripts = get_transcripts(video_ids)
     56     print(transcripts)

Cell In[3], line 10, in get_channel_videos(channel_id, api_key)
      9 def get_channel_videos(channel_id, api_key):
---> 10     youtube = googleapiclient.discovery.build(
     11         "youtube", "v3", developerKey=api_key)
     15     video_ids = []
     16     page_token = None

NameError: name 'googleapiclient' is not defined

I also think YouTubeTranscriptApi, used in get_transcripts(video_ids), is not defined.

I used the YouTube Data API key. Is that correct? It would help to make clear which API key that cell expects.
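For reference, the NameError disappears once the cell (or an earlier one) imports the two libraries it relies on. A minimal sketch, assuming the packages the notebook expects (google-api-python-client and youtube-transcript-api):

```python
# Install once in Colab; note the PyPI names differ from the import names.
# !pip install google-api-python-client youtube-transcript-api

import googleapiclient.discovery                          # provides googleapiclient.discovery.build(...)
from youtube_transcript_api import YouTubeTranscriptApi   # used inside get_transcripts(video_ids)
```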

All the best!

Status: 404 Response

Installing collected packages: protobuf
  Attempting uninstall: protobuf
    Found existing installation: protobuf 3.19.3
    Uninstalling protobuf-3.19.3:
      Successfully uninstalled protobuf-3.19.3
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
tensorflow-metadata 1.15.0 requires protobuf<4.21,>=3.20.3; python_version < "3.11", but you have protobuf 4.25.3 which is incompatible.
Successfully installed protobuf-4.25.3

TransportError                            Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/google/auth/compute_engine/credentials.py in refresh(self, request)
    127 try:
--> 128     self._retrieve_info(request)
    129     self.token, self.expiry = _metadata.get_service_account_token(

12 frames

TransportError: ("Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. Status: 404 Response:\nb''", <google_auth_httplib2._Response object at 0x7f7dfe064f10>)

The above exception was the direct cause of the following exception:

RefreshError                              Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/google/auth/compute_engine/credentials.py in refresh(self, request)
    132 except exceptions.TransportError as caught_exc:
    133     new_exc = exceptions.RefreshError(caught_exc)
--> 134     raise new_exc from caught_exc
    135
    136 @property

RefreshError: ("Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. Status: 404 Response:\nb''", <google_auth_httplib2._Response object at 0x7f7dfe064f10>)

Feature request: document how much it costs to scrape a channel, for reference

Hi,

Because this uses various API keys, especially the OpenAI (ChatGPT) API, users need to buy credits to set up the system.

Would you mind sharing an example using a YouTube channel as a reference: how many videos it has and how much it costs to scrape everything? I assume the initial setup also uses the ChatGPT API.

Then there is the cost of actually using it, which I guess depends on the chat, but an example would also be helpful for new users.
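For a rough sense of scale, a back-of-envelope sketch follows; the video count, tokens per transcript, and per-token embedding price are all assumptions to replace with the channel's real numbers and current OpenAI pricing:

```python
# All numbers below are hypothetical placeholders, not measured figures.
num_videos = 200                          # assumption: videos with transcripts
tokens_per_transcript = 8_000             # assumption: ~45 minutes of speech per video
embedding_price_per_1k_tokens = 0.0001    # assumption: ada-002-era embedding price, USD

one_time_embedding_cost = (
    num_videos * tokens_per_transcript / 1_000 * embedding_price_per_1k_tokens
)
print(f"Approximate one-time embedding cost: ${one_time_embedding_cost:.2f}")  # ~$0.16 here

# Per-conversation cost is separate: it depends on the chat model's prompt and
# completion tokens for each question, so it scales with usage, not channel size.
```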

Does the Pinecone Starter (free) plan work?

Thank you
