 README.md                         | 133
 docker-compose.yml                |  14
 gpt4free/README.md                |   1
 gpt4free/forefront/__init__.py    |   6
 gpt4free/quora/README.md          |   2
 gpt4free/quora/__init__.py        |   8
 gpt4free/quora/backup-mail.py     |  37
 gpt4free/theb/__init__.py         |  11
 gpt4free/you/__init__.py          |   4
 gui/query_methods.py              |  29
 gui/streamlit_chat_app.py         |  24
 testing/usesless_test.py          |  13
 unfinished/usesless/README.md     |  23
 unfinished/usesless/__init__.py   |  51
 unfinished/writesonic/README.md   |  53
 unfinished/writesonic/__init__.py | 163
 16 files changed, 240 insertions(+), 332 deletions(-)
diff --git a/README.md b/README.md
index 3366bb31..a2ec683c 100644
--- a/README.md
+++ b/README.md
@@ -1,63 +1,5 @@
-We got a takedown request by openAI's legal team...
+##### You may join our discord server for updates and support ; )
-discord server for updates / support:
-- https://discord.gg/gpt4free
-
-here is a lil' poem you can read in the meantime, while I am investigating it:
-
-```
-There once was a time, in a land full of code,
-A little boy sat, in his humble abode.
-He tinkered and toyed with devtools galore,
-And found himself curious, and eager for more.
-
-He copied and pasted, with glee and delight,
-A personal project, to last him the night.
-For use academic, and also for fun,
-This little boy's race he just started to run.
-
-Now quite far removed, in a tower so grand,
-A company stood, it was ruling the land.
-Their software was mighty, their power supreme,
-But they never expected this boy and his dream.
-
-As he played with their code, they then started to fear,
-"His project is free! What of money so dear?"
-They panicked and worried, their faces turned red,
-As visions of chaos now filled every head.
-
-The CEO paced, in his office so wide,
-His minions all scrambled, and trying to hide.
-"Who is this bad child?" he cried out in alarm,
-"Our great AI moat, why would he cause harm?"
-
-The developers gathered, their keyboards ablaze,
-To analyze closely the boy's evil ways.
-They studied his project, they cracked every tome,
-And soon they discovered his small, humble home.
-
-"We must stop him!" they cried, with a shout and a shiver,
-"This little boy's Mᴀᴋɪɴɢ OUR COMPANY QUIVER!"
-So they plotted and schemed to yet halt his advance,
-To put an end to his dear digital dance.
-
-They filed then with GitHub a claim most obscene,
-"His code is not his," said the company team,
-Because of the law, the Great Copyright Mess,
-This little boy got his first takedown request.
-
-Now new information we do not yet know,
-But for the boy's good, we hope results show.
-For the cause of the True, the Brave and the Right,
-Till the long bitter end, will this boy live to fight.
-```
-( I did not write it )
-
-
-_____________________________
-
-
-##### You may join our discord server for updates and support ; )
- [Discord Link](https://discord.gg/gpt4free)
<img width="1383" alt="image" src="https://user-images.githubusercontent.com/98614666/233799515-1a7cb6a3-b17f-42c4-956d-8d2a0664466f.png">
@@ -66,17 +8,16 @@ Just API's from some language model sites.
## Legal Notice <a name="legal-notice"></a>
-This repository uses third-party APIs and is *not* associated with or endorsed by the API providers. This project is intended **for educational purposes only**. This is just a little personal project. Sites may contact me to improve their security.
+This repository uses third-party APIs and is _not_ associated with or endorsed by the API providers. This project is intended **for educational purposes only**. This is just a little personal project. Sites may contact me to improve their security.
Please note the following:
-1. **Disclaimer**: The APIs, services, and trademarks mentioned in this repository belong to their respective owners. This project is *not* claiming any right over them.
+1. **Disclaimer**: The APIs, services, and trademarks mentioned in this repository belong to their respective owners. This project is _not_ claiming any right over them.
-2. **Responsibility**: The author of this repository is *not* responsible for any consequences arising from the use or misuse of this repository or the content provided by the third-party APIs and any damage or losses caused by users' actions.
+2. **Responsibility**: The author of this repository is _not_ responsible for any consequences arising from the use or misuse of this repository or the content provided by the third-party APIs and any damage or losses caused by users' actions.
3. **Educational Purposes Only**: This repository and its content are provided strictly for educational purposes. By using the information and code provided, users acknowledge that they are using the APIs and models at their own risk and agree to comply with any applicable laws and regulations.
-
## Table of Contents
| Section | Description | Link | Status |
| ------- | ----------- | ---- | ------ |
@@ -91,7 +32,7 @@ Please note the following:
| **Copyright** | Copyright information | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#copyright) | - |
| **Star History** | Star History | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#star-history) | - |
| **Usage Examples** | | | |
-| `theb` | Example usage for theb (gpt-3.5) | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](openai_rev/theb/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
+| `theb` | Example usage for theb (gpt-3.5) | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](gpt4free/theb/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
| `forefront` | Example usage for forefront (gpt-4) | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](gpt4free/forefront/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) | ||
| `quora (poe)` | Example usage for quora | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](gpt4free/quora/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
| `you` | Example usage for you | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](gpt4free/you/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
@@ -99,10 +40,9 @@ Please note the following:
| Google Colab Jupyter Notebook | Example usage for gpt4free | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/DanielShemesh/gpt4free-colab/blob/main/gpt4free.ipynb) | - |
| replit Example (feel free to fork this repl) | Example usage for gpt4free | [![](https://img.shields.io/badge/Open%20in-Replit-1A1E27?logo=replit)](https://replit.com/@gpt4free/gpt4free-webui) | - |
-
## Todo <a name="todo"></a>
-- [x] Add a GUI for the repo
+- [x] Add a GUI for the repo
- [ ] Make a general package named `gpt4free`, instead of different folders
- [ ] Live api status to know which are down and which can be used
- [ ] Integrate more API's in `./unfinished` as well as other ones in the lists
@@ -111,44 +51,53 @@ Please note the following:
## Current Sites <a name="current-sites"></a>
-| Website s | Model(s) |
-| ---------------------------------------------------- | ------------------------------- |
-| [forefront.ai](https://chat.forefront.ai) | GPT-4/3.5 |
-| [poe.com](https://poe.com) | GPT-4/3.5 |
-| [writesonic.com](https://writesonic.com) | GPT-3.5 / Internet |
-| [t3nsor.com](https://t3nsor.com) | GPT-3.5 |
-| [you.com](https://you.com) | GPT-3.5 / Internet / good search|
-| [sqlchat.ai](https://sqlchat.ai) | GPT-3.5 |
-| [bard.google.com](https://bard.google.com) | custom / search |
-| [bing.com/chat](https://bing.com/chat) | GPT-4/3.5 |
-| [chat.forefront.ai/](https://chat.forefront.ai/) | GPT-4/3.5 |
+| Website s | Model(s) |
+| ------------------------------------------------ | -------------------------------- |
+| [forefront.ai](https://chat.forefront.ai) | GPT-4/3.5 |
+| [poe.com](https://poe.com) | GPT-4/3.5 |
+| [writesonic.com](https://writesonic.com) | GPT-3.5 / Internet |
+| [t3nsor.com](https://t3nsor.com) | GPT-3.5 |
+| [you.com](https://you.com) | GPT-3.5 / Internet / good search |
+| [sqlchat.ai](https://sqlchat.ai) | GPT-3.5 |
+| [bard.google.com](https://bard.google.com) | custom / search |
+| [bing.com/chat](https://bing.com/chat) | GPT-4/3.5 |
+| [chat.forefront.ai/](https://chat.forefront.ai/) | GPT-4/3.5 |
-## Best sites <a name="best-sites"></a>
+## Best sites <a name="best-sites"></a>
#### gpt-4
+
- [`/forefront`](gpt4free/forefront/README.md)
#### gpt-3.5
+
- [`/you`](gpt4free/you/README.md)
-## Install <a name="install"></a>
+## Install <a name="install"></a>
+
Download or clone this GitHub repo
install requirements with:
+
```sh
pip3 install -r requirements.txt
```
## To start gpt4free GUI <a name="streamlit-gpt4free-gui"></a>
-Move `streamlit_app.py` from `./gui` to the base folder
-then run:
+
+Move `streamlit_app.py` from `./gui` to the base folder
+then run:
`streamlit run streamlit_app.py` or `python3 -m streamlit run streamlit_app.py`
## Docker <a name="docker-instructions"></a>
+
Build
+
```
docker build -t gpt4free:latest -f Docker/Dockerfile .
```
+
Run
+
```
docker run -p 8501:8501 gpt4free:latest
```
@@ -157,20 +106,31 @@ Another way - docker-compose (no docker build/run needed)
docker-compose up -d
```
+## Deploy using docker-compose
+
+Run the following:
+
+```
+docker-compose up -d
+```
+
## ChatGPT clone
-> currently implementing new features and trying to scale it, please be patient it may be unstable
+
+> currently implementing new features and trying to scale it, please be patient it may be unstable
> https://chat.chatbot.sex/chat
-> This site was developed by me and includes **gpt-4/3.5**, **internet access** and **gpt-jailbreak's** like DAN
+> This site was developed by me and includes **gpt-4/3.5**, **internet access** and **gpt-jailbreak's** like DAN
> run locally here: https://github.com/xtekky/chatgpt-clone
-## Copyright:
-This program is licensed under the [GNU GPL v3](https://www.gnu.org/licenses/gpl-3.0.txt)
+## Copyright:
+
+This program is licensed under the [GNU GPL v3](https://www.gnu.org/licenses/gpl-3.0.txt)
Most code, with the exception of `quora/api.py` (by [ading2210](https://github.com/ading2210)), has been written by me, [xtekky](https://github.com/xtekky).
### Copyright Notice: <a name="copyright"></a>
+
```
-xtekky/gpt4free: multiple reverse engineered language-model api's to decentralise the ai industry.
+xtekky/gpt4free: multiple reverse engineered language-model api's to decentralise the ai industry.
Copyright (C) 2023 xtekky
This program is free software: you can redistribute it and/or modify
@@ -188,4 +148,5 @@ along with this program. If not, see <https://www.gnu.org/licenses/>.
```
## Star History <a name="star-history"></a>
+
[![Star History Chart](https://api.star-history.com/svg?repos=xtekky/gpt4free&type=Date)](https://star-history.com/#xtekky/gpt4free)
diff --git a/docker-compose.yml b/docker-compose.yml
new file mode 100644
index 00000000..e8fcb0de
--- /dev/null
+++ b/docker-compose.yml
@@ -0,0 +1,14 @@
+version: '3.8'
+
+services:
+ gpt4:
+ build:
+ context: .
+ dockerfile: Dockerfile
+ image: gpt4free:latest
+ container_name: gpt4
+ ports:
+ - 8501:8501
+ restart: unless-stopped
+ read_only: true
+
diff --git a/gpt4free/README.md b/gpt4free/README.md
index 23f81787..f3ba27ab 100644
--- a/gpt4free/README.md
+++ b/gpt4free/README.md
@@ -19,7 +19,6 @@ pip install gpt4free
```python
import gpt4free
-import gpt4free
from gpt4free import Provider, quora, forefront
# usage You
diff --git a/gpt4free/forefront/__init__.py b/gpt4free/forefront/__init__.py
index f0ca1a15..aa78cfa7 100644
--- a/gpt4free/forefront/__init__.py
+++ b/gpt4free/forefront/__init__.py
@@ -98,12 +98,15 @@ class StreamingCompletion:
action_type='new',
default_persona='607e41fe-95be-497e-8e97-010a59b2e2c0', # default
model='gpt-4',
+ proxy=None
) -> Generator[ForeFrontResponse, None, None]:
if not token:
raise Exception('Token is required!')
if not chat_id:
chat_id = str(uuid4())
+ proxies = { 'http': 'http://' + proxy, 'https': 'http://' + proxy } if proxy else None
+
headers = {
'authority': 'chat-server.tenant-forefront-default.knative.chi.coreweave.com',
'accept': '*/*',
@@ -135,6 +138,7 @@ class StreamingCompletion:
for chunk in post(
'https://chat-server.tenant-forefront-default.knative.chi.coreweave.com/chat',
headers=headers,
+ proxies=proxies,
json=json_data,
stream=True,
).iter_lines():
@@ -169,6 +173,7 @@ class Completion:
action_type='new',
default_persona='607e41fe-95be-497e-8e97-010a59b2e2c0', # default
model='gpt-4',
+ proxy=None
) -> ForeFrontResponse:
text = ''
final_response = None
@@ -179,6 +184,7 @@ class Completion:
action_type=action_type,
default_persona=default_persona,
model=model,
+ proxy=proxy
):
if response:
final_response = response
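The proxy plumbing added across these modules follows one pattern: an optional `host:port` string is expanded inline into a requests-style `proxies` mapping. A minimal sketch of that logic as a reusable helper (the name `build_proxies` is mine, not the diff's):

```python
from typing import Optional


def build_proxies(proxy: Optional[str]) -> Optional[dict]:
    """Expand an optional 'host:port' proxy string into a
    requests-style proxies mapping, as the diff does inline."""
    if not proxy:
        return None
    return {'http': 'http://' + proxy, 'https': 'http://' + proxy}


# Pass the result straight to requests, e.g.
# post(url, headers=headers, proxies=build_proxies('127.0.0.1:8080'), ...)
print(build_proxies('127.0.0.1:8080'))
```

Note the diff assumes an HTTP proxy for both schemes (`http://` prefix even for the `https` key), which is how requests-style tunneling proxies are usually configured.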
diff --git a/gpt4free/quora/README.md b/gpt4free/quora/README.md
index c6eeac3e..9c652c59 100644
--- a/gpt4free/quora/README.md
+++ b/gpt4free/quora/README.md
@@ -55,7 +55,7 @@ print(response.completion.choices[0].text)
### Update Use This For Poe
```python
-from quora import Poe
+from gpt4free.quora import Poe
# available models: ['Sage', 'GPT-4', 'Claude+', 'Claude-instant', 'ChatGPT', 'Dragonfly', 'NeevaAI']
diff --git a/gpt4free/quora/__init__.py b/gpt4free/quora/__init__.py
index f548ff41..afbfb68d 100644
--- a/gpt4free/quora/__init__.py
+++ b/gpt4free/quora/__init__.py
@@ -187,7 +187,7 @@ class Account:
enable_bot_creation: bool = False,
):
client = TLS(client_identifier='chrome110')
- client.proxies = {'http': f'http://{proxy}', 'https': f'http://{proxy}'} if proxy else None
+ client.proxies = {'http': f'http://{proxy}', 'https': f'http://{proxy}'} if proxy else {}
mail_client = Emailnator()
mail_address = mail_client.get_mail()
@@ -293,10 +293,13 @@ class StreamingCompletion:
custom_model: bool = None,
prompt: str = 'hello world',
token: str = '',
+ proxy: Optional[str] = None
) -> Generator[PoeResponse, None, None]:
_model = MODELS[model] if not custom_model else custom_model
+ proxies = { 'http': 'http://' + proxy, 'https': 'http://' + proxy } if proxy else False
client = PoeClient(token)
+ client.proxy = proxies
for chunk in client.send_message(_model, prompt):
yield PoeResponse(
@@ -330,10 +333,13 @@ class Completion:
custom_model: str = None,
prompt: str = 'hello world',
token: str = '',
+ proxy: Optional[str] = None
) -> PoeResponse:
_model = MODELS[model] if not custom_model else custom_model
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else False
client = PoeClient(token)
+ client.proxy = proxies
chunk = None
for response in client.send_message(_model, prompt):
diff --git a/gpt4free/quora/backup-mail.py b/gpt4free/quora/backup-mail.py
new file mode 100644
index 00000000..0a2a5e94
--- /dev/null
+++ b/gpt4free/quora/backup-mail.py
@@ -0,0 +1,37 @@
+from requests import Session
+from time import sleep
+from json import loads
+from re import findall
+class Mail:
+ def __init__(self) -> None:
+ self.client = Session()
+ self.client.post("https://etempmail.com/")
+ self.cookies = {'acceptcookie': 'true'}
+ self.cookies["ci_session"] = self.client.cookies.get_dict()["ci_session"]
+ self.email = None
+ def get_mail(self):
+ respone=self.client.post("https://etempmail.com/getEmailAddress")
+ #cookies
+ self.cookies["lisansimo"] = eval(respone.text)["recover_key"]
+ self.email = eval(respone.text)["address"]
+ return self.email
+ def get_message(self):
+ print("Waiting for message...")
+ while True:
+ sleep(5)
+ respone=self.client.post("https://etempmail.com/getInbox")
+ mail_token=loads(respone.text)
+ print(self.client.cookies.get_dict())
+ if len(mail_token) == 1:
+ break
+
+ params = {'id': '1',}
+ self.mail_context = self.client.post("https://etempmail.com/getInbox",params=params)
+ self.mail_context = eval(self.mail_context.text)[0]["body"]
+ return self.mail_context
+ #,cookies=self.cookies
+ def get_verification_code(self):
+ message = self.mail_context
+ code = findall(r';">(\d{6,7})</div>', message)[0]
+ print(f"Verification code: {code}")
+ return code
\ No newline at end of file
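`backup-mail.py` parses the JSON bodies with `eval(...)`, which executes arbitrary text from the server. A safer equivalent, assuming the endpoints return plain JSON, is `json.loads` (or `response.json()`); the verification-code regex can stay as-is. A hedged sketch of the two parsing steps:

```python
import json
import re


def parse_address(body: str) -> tuple:
    """Parse the getEmailAddress payload without eval().
    Assumes a JSON object with 'address' and 'recover_key' keys."""
    data = json.loads(body)  # json.loads instead of eval: no code execution
    return data["address"], data["recover_key"]


def extract_code(html: str) -> str:
    """Pull the 6-7 digit verification code out of the mail body,
    using the same pattern the script uses."""
    return re.findall(r';">(\d{6,7})</div>', html)[0]


print(parse_address('{"address": "a@b.c", "recover_key": "k"}'))
print(extract_code('<div style=";">123456</div>'))
```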
diff --git a/gpt4free/theb/__init__.py b/gpt4free/theb/__init__.py
index 96053877..75a15068 100644
--- a/gpt4free/theb/__init__.py
+++ b/gpt4free/theb/__init__.py
@@ -2,7 +2,7 @@ from json import loads
from queue import Queue, Empty
from re import findall
from threading import Thread
-from typing import Generator
+from typing import Generator, Optional
from curl_cffi import requests
from fake_useragent import UserAgent
@@ -19,7 +19,7 @@ class Completion:
stream_completed = False
@staticmethod
- def request(prompt: str):
+ def request(prompt: str, proxy: Optional[str]=None):
headers = {
'authority': 'chatbot.theb.ai',
'content-type': 'application/json',
@@ -27,9 +27,12 @@ class Completion:
'user-agent': UserAgent().random,
}
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else None
+
requests.post(
'https://chatbot.theb.ai/api/chat-process',
headers=headers,
+ proxies=proxies,
content_callback=Completion.handle_stream_response,
json={'prompt': prompt, 'options': {}},
)
@@ -37,8 +40,8 @@ class Completion:
Completion.stream_completed = True
@staticmethod
- def create(prompt: str) -> Generator[str, None, None]:
- Thread(target=Completion.request, args=[prompt]).start()
+ def create(prompt: str, proxy: Optional[str]=None) -> Generator[str, None, None]:
+ Thread(target=Completion.request, args=[prompt, proxy]).start()
while not Completion.stream_completed or not Completion.message_queue.empty():
try:
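`theb` streams by running the HTTP request in a background thread that feeds a queue, while `create` drains the queue until the request completes. Stripped of the HTTP call, the pattern looks like this (names are illustrative):

```python
from queue import Queue, Empty
from threading import Thread
from typing import Callable, Generator


def stream(produce: Callable) -> Generator[str, None, None]:
    """Run `produce(put)` in a worker thread; yield items as they
    arrive, draining the queue even after the producer finishes."""
    queue: Queue = Queue()
    done = []  # per-call completion flag, like stream_completed

    def worker():
        produce(queue.put)
        done.append(True)

    Thread(target=worker).start()
    while not done or not queue.empty():
        try:
            yield queue.get(timeout=0.01)
        except Empty:
            pass


print(list(stream(lambda put: [put(c) for c in ("a", "b", "c")])))
```

Unlike the module's class-level `stream_completed` flag and shared `message_queue`, a per-call queue and flag would keep concurrent streams independent.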
diff --git a/gpt4free/you/__init__.py b/gpt4free/you/__init__.py
index 97b48464..d084a842 100644
--- a/gpt4free/you/__init__.py
+++ b/gpt4free/you/__init__.py
@@ -30,12 +30,16 @@ class Completion:
include_links: bool = False,
detailed: bool = False,
debug: bool = False,
+ proxy: Optional[str] = None
) -> PoeResponse:
if chat is None:
chat = []
+ proxies = { 'http': 'http://' + proxy, 'https': 'http://' + proxy } if proxy else {}
+
client = Session(client_identifier='chrome_108')
client.headers = Completion.__get_headers()
+ client.proxies = proxies
response = client.get(
f'https://you.com/api/streamingSearch',
diff --git a/gui/query_methods.py b/gui/query_methods.py
index 6225453b..2d6adacd 100644
--- a/gui/query_methods.py
+++ b/gui/query_methods.py
@@ -1,5 +1,6 @@
import os
import sys
+from typing import Optional
sys.path.append(os.path.join(os.path.dirname(__file__), os.path.pardir))
@@ -7,14 +8,14 @@ from gpt4free import quora, forefront, theb, you
import random
-def query_forefront(question: str) -> str:
+def query_forefront(question: str, proxy: Optional[str] = None) -> str:
# create an account
- token = forefront.Account.create(logging=False)
+ token = forefront.Account.create(logging=False, proxy=proxy)
response = ""
# get a response
try:
- return forefront.Completion.create(token=token, prompt='hello world', model='gpt-4').text
+ return forefront.Completion.create(token=token, prompt='hello world', model='gpt-4', proxy=proxy).text
except Exception as e:
# Return error message if an exception occurs
return (
@@ -22,16 +23,16 @@ def query_forefront(question: str) -> str:
)
-def query_quora(question: str) -> str:
- token = quora.Account.create(logging=False, enable_bot_creation=True)
- return quora.Completion.create(model='gpt-4', prompt=question, token=token).text
+def query_quora(question: str, proxy: Optional[str] = None) -> str:
+ token = quora.Account.create(logging=False, enable_bot_creation=True, proxy=proxy)
+ return quora.Completion.create(model='gpt-4', prompt=question, token=token, proxy=proxy).text
-def query_theb(question: str) -> str:
+def query_theb(question: str, proxy: Optional[str] = None) -> str:
# Set cloudflare clearance cookie and get answer from GPT-4 model
response = ""
try:
- return ''.join(theb.Completion.create(prompt=question))
+ return ''.join(theb.Completion.create(prompt=question, proxy=proxy))
except Exception as e:
# Return error message if an exception occurs
@@ -40,11 +41,11 @@ def query_theb(question: str) -> str:
)
-def query_you(question: str) -> str:
+def query_you(question: str, proxy: Optional[str] = None) -> str:
# Set cloudflare clearance cookie and get answer from GPT-4 model
try:
- result = you.Completion.create(prompt=question)
- return result["response"]
+ result = you.Completion.create(prompt=question, proxy=proxy)
+ return result.text
except Exception as e:
# Return error message if an exception occurs
@@ -66,11 +67,11 @@ avail_query_methods = {
}
-def query(user_input: str, selected_method: str = "Random") -> str:
+def query(user_input: str, selected_method: str = "Random", proxy: Optional[str] = None) -> str:
# If a specific query method is selected (not "Random") and the method is in the dictionary, try to call it
if selected_method != "Random" and selected_method in avail_query_methods:
try:
- return avail_query_methods[selected_method](user_input)
+ return avail_query_methods[selected_method](user_input, proxy=proxy)
except Exception as e:
print(f"Error with {selected_method}: {e}")
return "😵 Sorry, some error occurred please try again."
@@ -89,7 +90,7 @@ def query(user_input: str, selected_method: str = "Random") -> str:
chosen_query_name = [k for k, v in avail_query_methods.items() if v == chosen_query][0]
try:
# Try to call the chosen method with the user input
- result = chosen_query(user_input)
+ result = chosen_query(user_input, proxy=proxy)
success = True
except Exception as e:
print(f"Error with {chosen_query_name}: {e}")
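The change threads `proxy` through every entry of `avail_query_methods`, which works because each backend now accepts the same keyword. A minimal sketch of that dispatch, with stub backends standing in for the real ones:

```python
import random
from typing import Optional


def backend_a(q: str, proxy: Optional[str] = None) -> str:
    return f"a:{q}:{proxy}"


def backend_b(q: str, proxy: Optional[str] = None) -> str:
    return f"b:{q}:{proxy}"


METHODS = {"A": backend_a, "B": backend_b}


def query(text: str, selected: str = "Random", proxy: Optional[str] = None) -> str:
    """Dispatch to one backend, forwarding the proxy keyword;
    'Random' (or an unknown name) picks any available backend."""
    fn = METHODS.get(selected) or random.choice(list(METHODS.values()))
    return fn(text, proxy=proxy)


print(query("hi", "A", proxy="127.0.0.1:8080"))
```

The uniform `proxy=` keyword is what lets the dictionary dispatch stay one line; a backend that lacked the parameter would raise `TypeError` here.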
diff --git a/gui/streamlit_chat_app.py b/gui/streamlit_chat_app.py
index 68011229..fc5c8d8e 100644
--- a/gui/streamlit_chat_app.py
+++ b/gui/streamlit_chat_app.py
@@ -24,9 +24,9 @@ def load_conversations():
def save_conversations(conversations, current_conversation):
updated = False
- for i, conversation in enumerate(conversations):
+ for idx, conversation in enumerate(conversations):
if conversation == current_conversation:
- conversations[i] = current_conversation
+ conversations[idx] = current_conversation
updated = True
break
if not updated:
@@ -71,19 +71,22 @@ if 'current_conversation' not in st.session_state or st.session_state['current_c
input_placeholder = st.empty()
user_input = input_placeholder.text_input(
- 'You:', key=f'input_text_{len(st.session_state["current_conversation"]["user_inputs"])}'
+ 'You:', value=st.session_state['input_text'], key=f'input_text_{st.session_state["input_field_key"]}'
)
submit_button = st.button("Submit")
-if user_input or submit_button:
+
+if (user_input and user_input != st.session_state['input_text']) or submit_button:
output = query(user_input, st.session_state['query_method'])
+
escaped_output = output.encode('utf-8').decode('unicode-escape')
st.session_state.current_conversation['user_inputs'].append(user_input)
st.session_state.current_conversation['generated_responses'].append(escaped_output)
save_conversations(st.session_state.conversations, st.session_state.current_conversation)
+ st.session_state['input_text'] = ''
user_input = input_placeholder.text_input(
- 'You:', value='', key=f'input_text_{len(st.session_state["current_conversation"]["user_inputs"])}'
+ 'You:', value=st.session_state['input_text'], key=f'input_text_{st.session_state["input_field_key"]}'
) # Clear the input field
# Add a button to create a new conversation
@@ -94,13 +97,16 @@ if st.sidebar.button("New Conversation"):
st.session_state['query_method'] = st.sidebar.selectbox("Select API:", options=avail_query_methods, index=0)
+# Proxy
+st.session_state['proxy'] = st.sidebar.text_input("Proxy: ")
+
# Sidebar
st.sidebar.header("Conversation History")
-for i, conversation in enumerate(st.session_state.conversations):
- if st.sidebar.button(f"Conversation {i + 1}: {conversation['user_inputs'][0]}", key=f"sidebar_btn_{i}"):
- st.session_state['selected_conversation'] = i
- st.session_state['current_conversation'] = st.session_state.conversations[i]
+for idx, conversation in enumerate(st.session_state.conversations):
+ if st.sidebar.button(f"Conversation {idx + 1}: {conversation['user_inputs'][0]}", key=f"sidebar_btn_{idx}"):
+ st.session_state['selected_conversation'] = idx
+ st.session_state['current_conversation'] = st.session_state.conversations[idx]
if st.session_state['selected_conversation'] is not None:
conversation_to_display = st.session_state.conversations[st.session_state['selected_conversation']]
diff --git a/testing/usesless_test.py b/testing/usesless_test.py
new file mode 100644
index 00000000..e2e35547
--- /dev/null
+++ b/testing/usesless_test.py
@@ -0,0 +1,13 @@
+import usesless
+
+question1 = "Who won the world series in 2020?"
+req = usesless.Completion.create(prompt=question1)
+answer = req["text"]
+message_id = req["parentMessageId"]
+
+question2 = "Where was it played?"
+req2 = usesless.Completion.create(prompt=question2, parentMessageId=message_id)
+answer2 = req2["text"]
+
+print(answer)
+print(answer2)
diff --git a/unfinished/usesless/README.md b/unfinished/usesless/README.md
new file mode 100644
index 00000000..13e9df8c
--- /dev/null
+++ b/unfinished/usesless/README.md
@@ -0,0 +1,23 @@
+ai.usesless.com
+
+to do:
+
+- use random user agent in header
+- make the code better I guess (?)
+
+### Example: `usesless` <a name="example-usesless"></a>
+
+```python
+import usesless
+
+message_id = ""
+while True:
+ prompt = input("Question: ")
+ if prompt == "!stop":
+ break
+
+ req = usesless.Completion.create(prompt=prompt, parentMessageId=message_id)
+
+ print(f"Answer: {req['text']}")
+ message_id = req["id"]
+```
diff --git a/unfinished/usesless/__init__.py b/unfinished/usesless/__init__.py
new file mode 100644
index 00000000..6f9a47ef
--- /dev/null
+++ b/unfinished/usesless/__init__.py
@@ -0,0 +1,51 @@
+import requests
+import json
+
+
+class Completion:
+ headers = {
+ "authority": "ai.usesless.com",
+ "accept": "application/json, text/plain, */*",
+ "accept-language": "en-US,en;q=0.5",
+ "cache-control": "no-cache",
+ "sec-fetch-dest": "empty",
+ "sec-fetch-mode": "cors",
+ "sec-fetch-site": "same-origin",
+ "user-agent": "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/112.0",
+ }
+
+ @staticmethod
+ def create(
+ systemMessage: str = "You are a helpful assistant",
+ prompt: str = "",
+ parentMessageId: str = "",
+ presence_penalty: float = 1,
+ temperature: float = 1,
+ model: str = "gpt-3.5-turbo",
+ ):
+ json_data = {
+ "openaiKey": "",
+ "prompt": prompt,
+ "options": {
+ "parentMessageId": parentMessageId,
+ "systemMessage": systemMessage,
+ "completionParams": {
+ "presence_penalty": presence_penalty,
+ "temperature": temperature,
+ "model": model,
+ },
+ },
+ }
+
+ url = "https://ai.usesless.com/api/chat-process"
+ request = requests.post(url, headers=Completion.headers, json=json_data)
+ content = request.content
+ response = Completion.__response_to_json(content)
+ return response
+
+ @classmethod
+ def __response_to_json(cls, text) -> dict:
+ text = str(text.decode("utf-8"))
+ split_text = text.rsplit("\n", 1)[1]
+ to_json = json.loads(split_text)
+ return to_json
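`ai.usesless.com` streams one JSON object per line, and only the last line carries the final message state, which is why `__response_to_json` keeps just `rsplit("\n", 1)[1]`. A slightly more defensive sketch (using `[-1]`, so a single-line body does not raise `IndexError` the way the original index `[1]` would):

```python
import json


def last_json_line(raw: bytes) -> dict:
    """Decode a line-delimited JSON stream and parse its final line,
    which carries the completed message state."""
    text = raw.decode("utf-8").strip()
    last_line = text.rsplit("\n", 1)[-1]  # [-1] also works for one-line bodies
    return json.loads(last_line)


sample = b'{"text": "partial"}\n{"text": "full answer", "id": "42"}'
print(last_json_line(sample))
```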
diff --git a/unfinished/writesonic/README.md b/unfinished/writesonic/README.md
deleted file mode 100644
index a658a87c..00000000
--- a/unfinished/writesonic/README.md
+++ /dev/null
@@ -1,53 +0,0 @@
-### Example: `writesonic` (use like openai pypi package) <a name="example-writesonic"></a>
-
-```python
-# import writesonic
-import writesonic
-
-# create account (3-4s)
-account = writesonic.Account.create(logging = True)
-
-# with loging:
- # 2023-04-06 21:50:25 INFO __main__ -> register success : '{"id":"51aa0809-3053-44f7-922a...' (2s)
- # 2023-04-06 21:50:25 INFO __main__ -> id : '51aa0809-3053-44f7-922a-2b85d8d07edf'
- # 2023-04-06 21:50:25 INFO __main__ -> token : 'eyJhbGciOiJIUzI1NiIsInR5cCI6Ik...'
- # 2023-04-06 21:50:28 INFO __main__ -> got key : '194158c4-d249-4be0-82c6-5049e869533c' (2s)
-
-# simple completion
-response = writesonic.Completion.create(
- api_key = account.key,
- prompt = 'hello world'
-)
-
-print(response.completion.choices[0].text) # Hello! How may I assist you today?
-
-# conversation
-
-response = writesonic.Completion.create(
- api_key = account.key,
- prompt = 'what is my name ?',
- enable_memory = True,
- history_data = [
- {
- 'is_sent': True,
- 'message': 'my name is Tekky'
- },
- {
- 'is_sent': False,
- 'message': 'hello Tekky'
- }
- ]
-)
-
-print(response.completion.choices[0].text) # Your name is Tekky.
-
-# enable internet
-
-response = writesonic.Completion.create(
- api_key = account.key,
- prompt = 'who won the quatar world cup ?',
- enable_google_results = True
-)
-
-print(response.completion.choices[0].text) # Argentina won the 2022 FIFA World Cup tournament held in Qatar ...
-```
\ No newline at end of file
diff --git a/unfinished/writesonic/__init__.py b/unfinished/writesonic/__init__.py
deleted file mode 100644
index ce684912..00000000
--- a/unfinished/writesonic/__init__.py
+++ /dev/null
@@ -1,163 +0,0 @@
-from random import choice
-from time import time
-
-from colorama import Fore, init;
-from names import get_first_name, get_last_name
-from requests import Session
-from requests import post
-
-init()
-
-
-class logger:
- @staticmethod
- def info(string) -> print:
- import datetime
- now = datetime.datetime.now()
- return print(
- f"{Fore.CYAN}{now.strftime('%Y-%m-%d %H:%M:%S')} {Fore.BLUE}INFO {Fore.MAGENTA}__main__ -> {Fore.RESET}{string}")
-
-
-class SonicResponse:
- class Completion:
- class Choices:
- def __init__(self, choice: dict) -> None:
- self.text = choice['text']
- self.content = self.text.encode()
- self.index = choice['index']
- self.logprobs = choice['logprobs']
- self.finish_reason = choice['finish_reason']
-
- def __repr__(self) -> str:
- return f'''<__main__.APIResponse.Completion.Choices(\n text = {self.text.encode()},\n index = {self.index},\n logprobs = {self.logprobs},\n finish_reason = {self.finish_reason})object at 0x1337>'''
-
- def __init__(self, choices: dict) -> None:
- self.choices = [self.Choices(choice) for choice in choices]
-
- class Usage:
- def __init__(self, usage_dict: dict) -> None:
- self.prompt_tokens = usage_dict['prompt_chars']
- self.completion_tokens = usage_dict['completion_chars']
- self.total_tokens = usage_dict['total_chars']
-
- def __repr__(self):
- return f'''<__main__.APIResponse.Usage(\n prompt_tokens = {self.prompt_tokens},\n completion_tokens = {self.completion_tokens},\n total_tokens = {self.total_tokens})object at 0x1337>'''
-
- def __init__(self, response_dict: dict) -> None:
- self.response_dict = response_dict
- self.id = response_dict['id']
- self.object = response_dict['object']
- self.created = response_dict['created']
- self.model = response_dict['model']
- self.completion = self.Completion(response_dict['choices'])
- self.usage = self.Usage(response_dict['usage'])
-
- def json(self) -> dict:
- return self.response_dict
-
-
-class Account:
- session = Session()
- session.headers = {
- "connection": "keep-alive",
- "sec-ch-ua": "\"Not_A Brand\";v=\"99\", \"Google Chrome\";v=\"109\", \"Chromium\";v=\"109\"",
- "accept": "application/json, text/plain, */*",
- "content-type": "application/json",
- "sec-ch-ua-mobile": "?0",
- "user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
- "sec-ch-ua-platform": "\"Windows\"",
- "sec-fetch-site": "same-origin",
- "sec-fetch-mode": "cors",
- "sec-fetch-dest": "empty",
- # "accept-encoding" : "gzip, deflate, br",
- "accept-language": "en-GB,en-US;q=0.9,en;q=0.8",
- "cookie": ""
- }
-
- @staticmethod
- def get_user():
- password = f'0opsYouGoTme@1234'
- f_name = get_first_name()
- l_name = get_last_name()
- hosts = ['gmail.com', 'protonmail.com', 'proton.me', 'outlook.com']
-
- return {
- "email": f"{f_name.lower()}.{l_name.lower()}@{choice(hosts)}",
- "password": password,
- "confirm_password": password,
- "full_name": f'{f_name} {l_name}'
- }
-
- @staticmethod
- def create(logging: bool = False):
- while True:
- try:
- user = Account.get_user()
- start = time()
- response = Account.session.post("https://app.writesonic.com/api/session-login", json=user | {
- "utmParams": "{}",
- "visitorId": "0",
- "locale": "en",
- "userAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36",
- "signInWith": "password",
- "request_type": "signup",
- })
-
- if logging:
- logger.info(f"\x1b[31mregister success\x1b[0m : '{response.text[:30]}...' ({int(time() - start)}s)")
- logger.info(f"\x1b[31mid\x1b[0m : '{response.json()['id']}'")
- logger.info(f"\x1b[31mtoken\x1b[0m : '{response.json()['token'][:30]}...'")
-
- start = time()
- response = Account.session.post("https://api.writesonic.com/v1/business/set-business-active",
- headers={"authorization": "Bearer " + response.json()['token']})
- key = response.json()["business"]["api_key"]
- if logging: logger.info(f"\x1b[31mgot key\x1b[0m : '{key}' ({int(time() - start)}s)")
-
- return Account.AccountResponse(user['email'], user['password'], key)
-
- except Exception as e:
- if logging: logger.info(f"\x1b[31merror\x1b[0m : '{e}'")
- continue
-
- class AccountResponse:
- def __init__(self, email, password, key):
- self.email = email
- self.password = password
- self.key = key
-
-
-class Completion:
- def create(
- api_key: str,
- prompt: str,
- enable_memory: bool = False,
- enable_google_results: bool = False,
- history_data: list = []) -> SonicResponse:
- response = post('https://api.writesonic.com/v2/business/content/chatsonic?engine=premium',
- headers={"X-API-KEY": api_key},
- json={
- "enable_memory": enable_memory,
- "enable_google_results": enable_google_results,
- "input_text": prompt,
- "history_data": history_data}).json()
-
- return SonicResponse({
- 'id': f'cmpl-premium-{int(time())}',
- 'object': 'text_completion',
- 'created': int(time()),
- 'model': 'premium',
-
- 'choices': [{
- 'text': response['message'],
- 'index': 0,
- 'logprobs': None,
- 'finish_reason': 'stop'
- }],
-
- 'usage': {
- 'prompt_chars': len(prompt),
- 'completion_chars': len(response['message']),
- 'total_chars': len(prompt) + len(response['message'])
- }
- })