author     Tekky <98614666+xtekky@users.noreply.github.com>   2023-09-11 13:35:01 +0200
committer  GitHub <noreply@github.com>                        2023-09-11 13:35:01 +0200
commit     3bde2c06f3581fa095317bad8ae40b15d55877cb (patch)
tree       8c36f59a8c08b1ff35038d672fe0af390fa645d2
parent     Merge pull request #891 from Lin-jun-xiang/fix-TypeDict-error (diff)
parent     Merge branch 'main' into feature/docker-setup (diff)
-rw-r--r--  Dockerfile          33
-rw-r--r--  README.md           98
-rw-r--r--  docker-compose.yml  13
3 files changed, 128 insertions, 16 deletions
diff --git a/Dockerfile b/Dockerfile
new file mode 100644
index 00000000..36ca12f1
--- /dev/null
+++ b/Dockerfile
@@ -0,0 +1,33 @@
+# Use the official lightweight Python image.
+# https://hub.docker.com/_/python
+FROM python:3.9-slim
+
+# Ensure Python outputs everything immediately (useful for real-time logging in Docker).
+ENV PYTHONUNBUFFERED=1
+
+# Set the working directory in the container.
+WORKDIR /app
+
+# Update the system packages and install system-level dependencies required for compilation.
+# gcc: Compiler required for some Python packages.
+# build-essential: Contains necessary tools and libraries for building software.
+RUN apt-get update && apt-get install -y --no-install-recommends \
+ gcc \
+ build-essential \
+ && rm -rf /var/lib/apt/lists/*
+
+# Copy the project's requirements file into the container.
+COPY requirements.txt /app/
+
+# Upgrade pip for the latest features and install the project's Python dependencies.
+RUN pip install --upgrade pip && pip install -r requirements.txt
+
+# Copy the entire project into the container.
+# This may include all code, assets, and configuration files required to run the application.
+COPY . /app/
+
+# Install additional requirements specific to the interference module/package.
+RUN pip install -r interference/requirements.txt
+
+# Define the default command to run the app using Python's module mode.
+CMD ["python", "-m", "interference.app"]
diff --git a/README.md b/README.md
index 784b79be..d57a542c 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,3 @@
-
![248433934-7886223b-c1d1-4260-82aa-da5741f303bb](https://github.com/xtekky/gpt4free/assets/98614666/ea012c87-76e0-496a-8ac4-e2de090cc6c9)
By using this repository or any code related to it, you agree to the [legal notice](./LEGAL_NOTICE.md). The author is not responsible for any copies, forks, reuploads made by other users, or anything else related to gpt4free. This is the author's only account and repository. To prevent impersonation or irresponsible actions, please comply with the GNU GPL license this Repository uses.
@@ -7,54 +6,69 @@ This (quite censored) New Version of gpt4free, was just released so it may conta
P.S.: Docker is not available for now, but I would be happy if someone contributes a PR. The g4f GUI will be uploaded soon enough.
### New
+
- pypi package:
+
```
pip install -U g4f
```
## Table of Contents:
+- [Table of Contents:](#table-of-contents)
- [Getting Started](#getting-started)
- + [Prerequisites](#prerequisites)
- + [Setting up the project](#setting-up-the-project)
+ - [Prerequisites:](#prerequisites)
+ - [Setting up the project:](#setting-up-the-project)
+ - [Install using pypi](#install-using-pypi)
+ - [or](#or)
+ - [Setting up with Docker:](#setting-up-with-docker)
- [Usage](#usage)
- * [The `g4f` Package](#the-g4f-package)
- * [interference openai-proxy api](#interference-openai-proxy-api-use-with-openai-python-package)
+ - [The `g4f` Package](#the-g4f-package)
+ - [interference openai-proxy api (use with openai python package)](#interference-openai-proxy-api-use-with-openai-python-package)
- [Models](#models)
- * [gpt-3.5 / gpt-4](#gpt-35--gpt-4)
- * [Other Models](#other-models)
+ - [gpt-3.5 / gpt-4](#gpt-35--gpt-4)
+ - [Other Models](#other-models)
- [Related gpt4free projects](#related-gpt4free-projects)
- [Contribute](#contribute)
- [ChatGPT clone](#chatgpt-clone)
-- [Copyright](#copyright)
-- [Copyright Notice](#copyright-notice)
+- [Copyright:](#copyright)
+- [Copyright Notice:](#copyright-notice)
- [Star History](#star-history)
## Getting Started
#### Prerequisites:
+
1. [Download and install Python](https://www.python.org/downloads/) (Version 3.x is recommended).
#### Setting up the project:
+
##### Install using pypi
+
```
pip install -U g4f
```
##### or
-1. Clone the GitHub repository:
+1. Clone the GitHub repository:
+
```
git clone https://github.com/xtekky/gpt4free.git
```
+
2. Navigate to the project directory:
+
```
cd gpt4free
```
+
3. (Recommended) Create a virtual environment to manage Python packages for your project:
+
```
python3 -m venv venv
```
+
4. Activate the virtual environment:
- On Windows:
```
@@ -65,20 +79,66 @@ python3 -m venv venv
source venv/bin/activate
```
5. Install the required Python packages from `requirements.txt`:
+
```
pip install -r requirements.txt
```
6. Create a `test.py` file in the root folder and start using the repo; further instructions are below
+
```py
import g4f
...
```
+##### Setting up with Docker:
+
+If you have Docker installed, you can easily set up and run the project without manually installing dependencies.
+
+1. First, ensure you have both Docker and Docker Compose installed.
+
+ - [Install Docker](https://docs.docker.com/get-docker/)
+ - [Install Docker Compose](https://docs.docker.com/compose/install/)
+
+2. Clone the GitHub repo:
+
+```bash
+git clone https://github.com/xtekky/gpt4free.git
+```
+
+3. Navigate to the project directory:
+
+```bash
+cd gpt4free
+```
+
+4. Build the Docker image:
+
+```bash
+docker compose build
+```
+
+5. Start the service using Docker Compose:
+
+```bash
+docker compose up
+```
+
+Your server will now be running at `http://localhost:1337`. You can interact with the API or run your tests as you would normally.
+
+To stop the Docker containers, simply run:
+
+```bash
+docker compose down
+```
+
+**Note:** When using Docker, any changes you make to your local files will be reflected in the Docker container thanks to the volume mapping in the `docker-compose.yml` file. If you add or remove dependencies, however, you'll need to rebuild the Docker image using `docker compose build`.
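
For instance, after adding a dependency to `requirements.txt`, a rebuild-and-restart cycle might look like this sketch (the `gpt4free` service name comes from `docker-compose.yml`):

```bash
# Rebuild the image so the new dependencies are installed, then restart in the background.
docker compose up --build -d

# Follow the server logs to confirm it came back up on port 1337.
docker compose logs -f gpt4free
```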
+
## Usage
### The `g4f` Package
+
```py
import g4f
@@ -118,6 +178,7 @@ for message in response:
print(message)
```
+
##### Providers:
```py
from g4f.Provider import (
@@ -213,14 +274,16 @@ async def run_async():
asyncio.run(run_async())
```
-### interference openai-proxy api (use with openai python package)
+### interference openai-proxy api (use with openai python package)
get requirements:
+
```sh
pip install -r interference/requirements.txt
```
run server:
+
```sh
python3 -m interference.app
```
@@ -254,7 +317,8 @@ if __name__ == "__main__":
main()
```
-## Models
+## Models
+
### gpt-3.5 / gpt-4
| Website| Provider| gpt-3.5 | gpt-4 | Streaming | Status | Auth |
@@ -291,6 +355,7 @@ if __name__ == "__main__":
| [supertest.lockchat.app](http://supertest.lockchat.app) | g4f.provider.Lockchat | ✔️ | ✔️ | ✔️ | ![Inactive](https://img.shields.io/badge/Inactive-red) | ❌ |
| [p5.v50.ltd](https://p5.v50.ltd) | g4f.provider.V50 | ✔️ | ❌ | ❌ | ![Inactive](https://img.shields.io/badge/Inactive-red) | ❌ |
+
### Other Models
| Model | Base Provider | Provider | Website |
@@ -397,6 +462,7 @@ if __name__ == "__main__":
## Contribute
To add another provider, it's very simple:
+
1. create a new file in [g4f/provider](./g4f/provider) with the name of the Provider
2. Implement a class that extends [BaseProvider](./g4f/provider/base_provider.py).
@@ -421,8 +487,8 @@ class HogeService(BaseProvider):
```
3. Here you can adjust the settings; for example, if the website supports streaming, set `working` to `True`...
-4. Write code to request the provider in `create_completion` and `yield` the response, *even if* its a one-time response, do not hesitate to look at other providers for inspiration
-5. Add the Provider Name in [g4f/provider/__init__.py](./g4f/provider/__init__.py)
+4. Write code to request the provider in `create_completion` and `yield` the response, _even if_ it's a one-time response; do not hesitate to look at other providers for inspiration
+5. Add the provider name in [`g4f/provider/__init__.py`](./g4f/provider/__init__.py)
```py
from .base_provider import BaseProvider
@@ -434,6 +500,7 @@ __all__ = [
```
6. You are done! Test the provider by calling it:
+
```py
import g4f
@@ -474,9 +541,8 @@ You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
```
-
## Star History
<a href="https://github.com/xtekky/gpt4free/stargazers">
<img width="500" alt="Star History Chart" src="https://api.star-history.com/svg?repos=xtekky/gpt4free&type=Date">
-</a>
\ No newline at end of file
+</a>
diff --git a/docker-compose.yml b/docker-compose.yml
new file mode 100644
index 00000000..9d357e46
--- /dev/null
+++ b/docker-compose.yml
@@ -0,0 +1,13 @@
+version: '3'
+
+services:
+ gpt4free:
+ build:
+ context: .
+ dockerfile: Dockerfile
+ volumes:
+ - .:/app
+ ports:
+ - '1337:1337'
+ environment:
+ - PYTHONUNBUFFERED=1
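
With the stack from this compose file running, you can sanity-check the setup from the host. A sketch, assuming you created the `test.py` from step 6 of the setup instructions (the `.:/app` volume mapping makes it visible inside the container at `/app`):

```bash
# Show the running services defined in docker-compose.yml.
docker compose ps

# Run a quick script inside the gpt4free container, e.g. the test.py from the setup steps.
docker compose exec gpt4free python test.py
```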