Compare commits
34 Commits
| Author | SHA1 | Date |
|---|---|---|
| | dbf45196ee | |
| | 45a2aaf7b8 | |
| | c02c62f3b1 | |
| | 4718674688 | |
| | 0e3c115368 | |
| | cc506c23af | |
| | 1f77d94b7e | |
| | 5711ff27ee | |
| | 3120602d6a | |
| | 9f36e195bc | |
| | 9de7da91a7 | |
| | 501a15a18f | |
| | 6049c9e3ff | |
| | 262b402606 | |
| | 56ea9563b8 | |
| | 2cd6612620 | |
| | 5d40396fb2 | |
| | 93dd1eb036 | |
| | 542a46dc7c | |
| | bf31b1fea0 | |
| | 25d4529ff9 | |
| | 33d7c67c04 | |
| | dc8f762bac | |
| | 49041e16c7 | |
| | 3414690e42 | |
| | 95c97561ae | |
| | 8bb4d7d590 | |
| | 94ad31dce3 | |
| | 91e9b167b3 | |
| | 53ea3dd9fb | |
| | c72a3a0362 | |
| | bd068c9a5a | |
| | 7997c3137a | |
| | b466b36e7a | |
@@ -1,2 +1,3 @@
**/node_modules
**/.env
api/.env
client/dist/images
.gitignore (1 change, vendored)

@@ -59,6 +59,7 @@ src/style - official.css
/playwright/.cache/
.DS_Store
*.code-workspace
.idea

# meilisearch
meilisearch
CHANGELOG.md (96 changes)

@@ -1,5 +1,93 @@
# Changelog

<details open>
<summary><strong>2023-05-14</strong></summary>

**Released [v0.4.4](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.4):**

1. The Msg Clipboard was changed to a checkmark for improved user experience by @techwithanirudh in PR [#247](https://github.com/danny-avila/chatgpt-clone/pull/247).
2. A typo in the auth.json path for accessing Google PaLM was corrected by @antonme in PR [#266](https://github.com/danny-avila/chatgpt-clone/pull/266).
3. @techwithanirudh added a Popup Menu to save sidebar space in PR [#260](https://github.com/danny-avila/chatgpt-clone/pull/260).
4. The default pageSize in Conversation.js was increased from 12 to 14 by @danny-avila in PR [#267](https://github.com/danny-avila/chatgpt-clone/pull/267).
5. Fonts were updated by @techwithanirudh in PR [#261](https://github.com/danny-avila/chatgpt-clone/pull/261).
6. Font file paths in style.css were changed by @danny-avila in PR [#268](https://github.com/danny-avila/chatgpt-clone/pull/268).
7. Code was fixed to adjust max_tokens according to model selection by @p4w4n in PR [#263](https://github.com/danny-avila/chatgpt-clone/pull/263).
8. Various improvements were made, such as fixing React errors and adjusting the mobile view, by @danny-avila in PR [#269](https://github.com/danny-avila/chatgpt-clone/pull/269).

New contributors to the project include:

- @techwithanirudh, who made their first contribution in PR [#247](https://github.com/danny-avila/chatgpt-clone/pull/247).
- @antonme, who made their first contribution in PR [#266](https://github.com/danny-avila/chatgpt-clone/pull/266).
- @p4w4n, who made their first contribution in PR [#263](https://github.com/danny-avila/chatgpt-clone/pull/263).

The [full changelog can be found here](https://github.com/danny-avila/chatgpt-clone/compare/v0.4.3...v0.4.4).
</details>

<details>
<summary><strong>2023-05-13</strong></summary>

**Released [v0.4.3](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.3), which now supports Google's PaLM 2!**



**How to Setup PaLM 2 (via Google Cloud Vertex AI API)**

- Enable the Vertex AI API on Google Cloud:
  - https://console.cloud.google.com/vertex-ai
- Create a Service Account:
  - https://console.cloud.google.com/projectselector/iam-admin/serviceaccounts/create?walkthrough_id=iam--create-service-account#step_index=1
- Make sure to click 'Create and Continue' to give at least the 'Vertex AI User' role.
- Create a JSON key, rename it to 'auth.json', and save it in /api/data/ (see the sketch below for how it is read).
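
The snippet below is a minimal sketch (not part of the release notes) of how that service-account key is consumed: the require path mirrors askGoogle.js, and the field names are the ones GoogleClient.js reads; all values shown are placeholders from a standard Google service-account JSON key.

```js
// Sketch: reading the service-account key saved as /api/data/auth.json.
// Field names match what GoogleClient.js uses; the values are placeholders.
const key = require('../../../data/auth.json'); // same path used by askGoogle.js

console.log(key.project_id);   // e.g. 'my-gcp-project'
console.log(key.client_email); // e.g. 'palm-api@my-gcp-project.iam.gserviceaccount.com'
// key.private_key holds the PEM-encoded private key used to sign the Vertex AI JWT
```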

**Alternatively**

- In your ./api/.env file, set PALM_KEY as "user_provided" to allow the user to provide a Service Account key JSON from the UI.
- Users follow the steps above except for renaming the file; they simply import the JSON when prompted.
- The key is sent to the server but is never saved anywhere except in your local storage.

**Note:**

- Vertex AI does not (yet) support response streaming for text generations, so responses may seem to take a while when generating a lot of text.
- Text streaming is simulated.

You can check the full changelog between v0.4.2 and v0.4.3 [here](https://github.com/danny-avila/chatgpt-clone/compare/v0.4.2...v0.4.3).
</details>

<details>
<summary><strong>2023-05-11</strong></summary>

**Released [v0.4.2](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.2)**

ChatGPT-Clone received some important upgrades and improvements. A new contributor, [@qcgm1978](https://github.com/qcgm1978), made their first contribution by adding a null check for the adaptiveCards variable. Additionally, support for titling conversations with the Azure endpoint was added by [@danny-avila](https://github.com/danny-avila) in PR [#234](https://github.com/danny-avila/chatgpt-clone/pull/234). In PR [#235](https://github.com/danny-avila/chatgpt-clone/pull/235), [@danny-avila](https://github.com/danny-avila) also made necessary fixes to titling, quotation marks, and endpoints being unavailable when only the Azure key is provided. The logging system is now powered by Pino with sanitization, thanks to [@danorlando](https://github.com/danorlando) in PR [#227](https://github.com/danny-avila/chatgpt-clone/pull/227). To bulletproof the Docker container, the .dockerignore file was updated to include the client/.env file by [@danny-avila](https://github.com/danny-avila) in PR [#241](https://github.com/danny-avila/chatgpt-clone/pull/241). This issue was brought to our attention on Discord.

There is active work on the new Plugins feature, on converting the frontend to TypeScript, and on integrating PaLM 2, Google's new generative AI accessible via API, into the project as a new endpoint.

You can check the full changelog between [v0.4.1](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.1) and [v0.4.2](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.2) [here](https://github.com/danny-avila/chatgpt-clone/compare/v0.4.1...v0.4.2).

For discussion and suggestions, you can join us on the **[community Discord server](https://discord.gg/NGaa9RPCft)**.
</details>

<details>
<summary><strong>2023-05-09</strong></summary>

**Released [v0.4.1](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.1)**

* update user system section of readme by @danorlando in #207
* remove github-passport and update package.lock files by @danorlando in #208
* Update README.md by @fuegovic in #209
* fix: fix browser refresh redirecting to /chat/new by @danorlando in #210
* fix: fix issue with validation when google account has multiple spaces in username by @danorlando in #211
* chore: update docker image version to use latest by @danny-avila in #218
* update documentation structure by @fuegovic in #220
* Feat: Add Azure support by @danny-avila in #219
* Update Message.js by @DavidDev1334 in #191

⚠️ **IMPORTANT:** Since v0.4.0, you should register and log in with a local account (email and password) for your first sign-up. If you log in for the first time with a social login account (e.g. Google, Facebook), the conversations and presets that you created before the user system was implemented will NOT be migrated to that account.

⚠️ **Breaking - new Env Variables:** Since v0.4.0, you need to add the new env variables from .env.example for the app to work, even if you're not using multiple users for your purposes.

For discussion and suggestions, you can join us on the **[community Discord server](https://discord.gg/NGaa9RPCft)**.
</details>

<details>
<summary><strong>2023-05-07</strong></summary>

**Released [v0.4.0](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.0)**, introducing the User/Auth System and OAuth2/Social Login! You can now register and log in with an email account or use Google login. Your previous conversations and presets will migrate to your new profile upon creation. Check out the details in the [User/Auth System](#userauth-system) section of the README.md.

@@ -10,11 +98,7 @@

For discussion and suggestions, you can join us on the **[community Discord server](https://discord.gg/NGaa9RPCft)**.
</details>

<details>
<summary><strong>Previous Updates</strong></summary>

<details>
<summary><strong>2023-04-05</strong></summary>

@@ -45,7 +129,7 @@ The above features are next and then I will have to focus on building the **test

On that note, I had to switch the default branch due to some breaking changes that haven't been straightforward to debug, mainly related to node-chatgpt-api, the main dependency of the project. Thankfully, my working branch, now switched to default as main, is working as expected.
</details>
</details>

##
@@ -1,6 +1,6 @@
# Contributors List

Here is a list of all contributors to this project:
We appreciate all the contributors who helped make this project possible:

- danny-avila (Admin)
- wtlyu (Contributor)
@@ -8,11 +8,12 @@ Here is a list of all contributors to this project:
- alfredo-f (Contributor)
- HyunggyuJang (Contributor)
- fuegovic (Contributor)
- DavidDev1334
- toordog (Contributor)
- heathriel (External Contributor)
- hackreactor-bot (Contributor)
- git-bruh (Contributor)
zhangsean (Contributor)
- zhangsean (Contributor)
- llk89 (Contributor)
- adamrb (Contributor)
Dockerfile (31 changes)

@@ -1,37 +1,32 @@
FROM node:19-alpine AS react-client
# Base node image
FROM node:19-alpine AS base
WORKDIR /api
COPY /api/package*.json /api/
WORKDIR /client
# copy package.json into the container at /client
COPY /client/package*.json /client/
# install dependencies
WORKDIR /
COPY /package*.json /
RUN npm ci
# Copy the current directory contents into the container at /client

# React client build
FROM base AS react-client
WORKDIR /client
COPY /client/ /client/
# Set the memory limit for Node.js
ENV NODE_OPTIONS="--max-old-space-size=2048"
# Build artifacts
RUN npm run build

FROM node:19-alpine AS node-api
# Node API setup
FROM base AS node-api
WORKDIR /api
# copy package.json into the container at /api
COPY /api/package*.json /api/
# install dependencies
RUN npm ci
# Copy the current directory contents into the container at /api
COPY /api/ /api/
# Copy the client side code
COPY --from=react-client /client/dist /client/dist
# Make port 3080 available to the world outside this container
EXPOSE 3080
# Expose the server to 0.0.0.0
ENV HOST=0.0.0.0
# Run the app when the container launches
CMD ["npm", "start"]

# Optional: for client with nginx routing
FROM nginx:stable-alpine AS nginx-client
WORKDIR /usr/share/nginx/html
COPY --from=react-client /client/dist /usr/share/nginx/html
# Add your nginx.conf
COPY /client/nginx.conf /etc/nginx/conf.d/default.conf
COPY client/nginx.conf /etc/nginx/conf.d/default.conf
ENTRYPOINT ["nginx", "-g", "daemon off;"]
@@ -6,10 +6,10 @@ WORKDIR /app
# Copy package.json files for client and api
COPY /client/package*.json /app/client/
COPY /api/package*.json /app/api/
COPY /package*.json /app/

# Install dependencies for both client and api
RUN cd /app/client && npm ci
RUN cd /app/api && npm ci
RUN npm ci

# Copy the current directory contents into the container
COPY /client/ /app/client/
README.md (78 changes)

@@ -40,18 +40,63 @@

##

<details open>
<summary><strong>2023-05-07</strong></summary>

**Released [v0.4.0](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.0)**, introducing the User/Auth System and OAuth2/Social Login! You can now register and log in with an email account or use Google login. Your previous conversations and presets will migrate to your new profile upon creation. Check out the details in the [User/Auth System](documents/features/user_auth_system.md) section of the README.md.
## **Google's PaLM 2 is now supported as of [v0.4.3](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.3)**



⚠️ **IMPORTANT:** You should register and log in with a local account (email and password) for your first sign-up. If you log in for the first time with a social login account (e.g. Google, Facebook), the conversations and presets that you created before the user system was implemented will NOT be migrated to that account.
<details>
<summary><strong>How to Setup PaLM 2 (via Google Cloud Vertex AI API)</strong></summary>
- Enable the Vertex AI API on Google Cloud:
  - https://console.cloud.google.com/vertex-ai
- Create a Service Account:
  - https://console.cloud.google.com/projectselector/iam-admin/serviceaccounts/create?walkthrough_id=iam--create-service-account#step_index=1
- Make sure to click 'Create and Continue' to give at least the 'Vertex AI User' role.
- Create a JSON key, rename it to 'auth.json', and save it in /api/data/.

⚠️ **Breaking - new Env Variables:** You will need to add the new env variables from .env.example for the app to work, even if you're not using multiple users for your purposes.
**Alternatively**

- In your ./api/.env file, set PALM_KEY as "user_provided" to allow the user to provide a Service Account key JSON from the UI.
- Users follow the steps above except for renaming the file; they simply import the JSON when prompted.
- The key is sent to the server but is never saved anywhere except in your local storage.

**Note:**

- Vertex AI does not (yet) support response streaming for text generations, so responses may seem to take a while when generating a lot of text.
- Text streaming is simulated.
</details>

---

<details open>
<summary><strong>2023-05-14</strong></summary>

**Released [v0.4.4](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.4):**

1. The Msg Clipboard was changed to a checkmark for improved user experience by @techwithanirudh in PR [#247](https://github.com/danny-avila/chatgpt-clone/pull/247).
2. A typo in the auth.json path for accessing Google PaLM was corrected by @antonme in PR [#266](https://github.com/danny-avila/chatgpt-clone/pull/266).
3. @techwithanirudh added a Popup Menu to save sidebar space in PR [#260](https://github.com/danny-avila/chatgpt-clone/pull/260).
4. The default pageSize in Conversation.js was increased from 12 to 14 by @danny-avila in PR [#267](https://github.com/danny-avila/chatgpt-clone/pull/267).
5. Fonts were updated by @techwithanirudh in PR [#261](https://github.com/danny-avila/chatgpt-clone/pull/261).
6. Font file paths in style.css were changed by @danny-avila in PR [#268](https://github.com/danny-avila/chatgpt-clone/pull/268).
7. Code was fixed to adjust max_tokens according to model selection by @p4w4n in PR [#263](https://github.com/danny-avila/chatgpt-clone/pull/263).
8. Various improvements were made, such as fixing React errors and adjusting the mobile view, by @danny-avila in PR [#269](https://github.com/danny-avila/chatgpt-clone/pull/269).

New contributors to the project include:

- @techwithanirudh, who made their first contribution in PR [#247](https://github.com/danny-avila/chatgpt-clone/pull/247).
- @antonme, who made their first contribution in PR [#266](https://github.com/danny-avila/chatgpt-clone/pull/266).
- @p4w4n, who made their first contribution in PR [#263](https://github.com/danny-avila/chatgpt-clone/pull/263).

The [full changelog can be found here](https://github.com/danny-avila/chatgpt-clone/compare/v0.4.3...v0.4.4).

⚠️ **IMPORTANT:** Since v0.4.0, you should register and log in with a local account (email and password) for your first sign-up. If you log in for the first time with a social login account (e.g. Google, Facebook), the conversations and presets that you created before the user system was implemented will NOT be migrated to that account.

⚠️ **Breaking - new Env Variables:** Since v0.4.0, you need to add the new env variables from .env.example for the app to work, even if you're not using multiple users for your purposes.

For discussion and suggestions, you can join us on the **[community Discord server](https://discord.gg/NGaa9RPCft)**.
</details>

[Past Updates](CHANGELOG.md)
##

<h1>Table of Contents</h1>
@@ -69,6 +114,7 @@ For discussion and suggestion you can join us: **[community discord server](http
<summary><strong>General Information</strong></summary>

* [Project Origin](documents/general_info/project_origin.md)
* [Multilingual Information](documents/general_info/multilingual_information.md)
* [Roadmap](documents/general_info/roadmap.md)
* [Tech Stack](documents/general_info/tech_stack.md)
* [Changelog](CHANGELOG.md)
@@ -94,8 +140,11 @@ For discussion and suggestion you can join us: **[community discord server](http
* [Code of Conduct](documents/contributions/code_of_conduct.md)
* [Contributor Guidelines](documents/contributions/contributor_guidelines.md)
* [Documentation Guidelines](documents/contributions/documentation_guidelines.md)
* [Code Standards and Conventions](documents/contributions/coding_conventions.md)
* [Testing](documents/contributions/testing.md)
* [Pull Request Template](documents/contributions/pull_request_template.md)
* [Contributors](CONTRIBUTORS.md)
* [Trello Board](https://trello.com/b/17z094kq/chatgpt-clone)
</details>

<details>
@@ -107,9 +156,15 @@ For discussion and suggestion you can join us: **[community discord server](http
</details>

##
### [Alternative Documentation](https://chatgpt-clone.gitbook.io/chatgpt-clone-docs/get-started/docker)

## Contributing
##

## Star History

[](https://star-history.com/#danny-avila/chatgpt-clone&Date)

## Contributors

Contributions, suggestions, bug reports, and fixes are welcome!
Please read the documentation before you do!

@@ -117,7 +172,8 @@ For new features, components, or extensions, please open an issue and discuss be

- Join the [Discord community](https://discord.gg/NGaa9RPCft)

## License

This project is licensed under the [MIT License](LICENSE.md).
##

This project exists in its current state thanks to all the people who contribute.

---

<a href="https://github.com/danny-avila/chatgpt-clone/graphs/contributors">
  <img src="https://contrib.rocks/image?repo=danny-avila/chatgpt-clone" />
</a>
@@ -82,13 +82,43 @@ CHATGPT_TOKEN="user_provided"

# Identify the available models, separated by commas. The first will be default.
# Leave it blank to use internal settings.
CHATGPT_MODELS=text-davinci-002-render-sha,text-davinci-002-render-paid,gpt-4
CHATGPT_MODELS=text-davinci-002-render-sha,gpt-4
# NOTE: you can add gpt-4-plugins, gpt-4-code-interpreter, and gpt-4-browsing to the list above and use the models for these features;
# however, the view/display portion of these features are not supported, but you can use the underlying models, which have higher token context
# Also: text-davinci-002-render-paid is deprecated as of May 2023

# Reverse proxy settings for ChatGPT
# https://github.com/waylaidwanderer/node-chatgpt-api#using-a-reverse-proxy
# By default, the server will use the node-chatgpt-api recommended proxy (a third party server).
# CHATGPT_REVERSE_PROXY=

##########################
# PaLM (Google) Endpoint:
##########################

# PaLM 2 Client (via Google Cloud Vertex AI API)
# Steps:
# Enable the Vertex AI API on Google Cloud:
# https://console.cloud.google.com/vertex-ai
# Create a Service Account:
# https://console.cloud.google.com/projectselector/iam-admin/serviceaccounts/create?walkthrough_id=iam--create-service-account#step_index=1
# Make sure to click 'Create and Continue' to give at least the 'Vertex AI User' role.
# Create a JSON key, rename as 'auth.json' and save it in /api/data/.
# Alternatively
# Uncomment below PALM_KEY and set as "user_provided" to allow the user to provide a Service Account key JSON from the UI.
# They will follow the steps above except for renaming the file.
# Leave blank or omit to disable this endpoint

# PALM_KEY="user_provided"

# In case you need a reverse proxy for this endpoint:
# GOOGLE_REVERSE_PROXY=

##########################
# Proxy: To be Used by all endpoints
##########################
PROXY=

##########################
# Search:
##########################
@@ -16,7 +16,7 @@ const askBing = async ({
  token,
  onProgress
}) => {
  const { BingAIClient } = await import('og-chatgpt-api');
  const { BingAIClient } = await import('@waylaidwanderer/chatgpt-api');
  const store = {
    store: new KeyvFile({ filename: './data/cache.json' })
  };

@@ -11,7 +11,7 @@ const browserClient = async ({
  abortController,
  userId
}) => {
  const { ChatGPTBrowserClient } = await import('og-chatgpt-api');
  const { ChatGPTBrowserClient } = await import('@waylaidwanderer/chatgpt-api');
  const store = {
    store: new KeyvFile({ filename: './data/cache.json' })
  };
@@ -1,5 +1,6 @@
require('dotenv').config();
const { KeyvFile } = require('keyv-file');
const { genAzureEndpoint } = require('../../utils/genAzureEndpoints');

const askClient = async ({
  text,
@@ -22,12 +23,13 @@ const askClient = async ({
  };

  const azure = process.env.AZURE_OPENAI_API_KEY ? true : false;

  const maxContextTokens = model === 'gpt-4' ? 8191 : model === 'gpt-4-32k' ? 32767 : 4095; // 1 less than maximum
  const clientOptions = {
    reverseProxyUrl: process.env.OPENAI_REVERSE_PROXY || null,
    azure,
    maxContextTokens,
    modelOptions: {
      model: model,
      model,
      temperature,
      top_p,
      presence_penalty,
@@ -36,18 +38,22 @@ const askClient = async ({
    chatGptLabel,
    promptPrefix,
    proxy: process.env.PROXY || null,
    debug: false
    // debug: true
  };

  let apiKey = process.env.OPENAI_KEY;

  if (azure) {
    apiKey = process.env.AZURE_OPENAI_API_KEY;
    clientOptions.reverseProxyUrl = `https://${process.env.AZURE_OPENAI_API_INSTANCE_NAME}.openai.azure.com/openai/deployments/${process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME}/chat/completions?api-version=${process.env.AZURE_OPENAI_API_VERSION}`;
    clientOptions.reverseProxyUrl = genAzureEndpoint({
      azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
      azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
      azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION
    });
  }

  const client = new ChatGPTClient(apiKey, clientOptions, store);

  const options = {
    onProgress,
    abortController,
api/app/google/GoogleClient.js (390 lines, new file)

@@ -0,0 +1,390 @@
const crypto = require('crypto');
const TextStream = require('../stream');
const { google } = require('googleapis');
const { Agent, ProxyAgent } = require('undici');
const { getMessages, saveMessage, saveConvo } = require('../../models');
const { encoding_for_model: encodingForModel, get_encoding: getEncoding } = require('@dqbd/tiktoken');

const tokenizersCache = {};

class GoogleAgent {
  constructor(credentials, options = {}) {
    this.client_email = credentials.client_email;
    this.project_id = credentials.project_id;
    this.private_key = credentials.private_key;
    this.setOptions(options);
    this.currentDateString = new Date().toLocaleDateString('en-us', {
      year: 'numeric',
      month: 'long',
      day: 'numeric'
    });
  }

  constructUrl() {
    return `https://us-central1-aiplatform.googleapis.com/v1/projects/${this.project_id}/locations/us-central1/publishers/google/models/${this.modelOptions.model}:predict`;
  }

  setOptions(options) {
    if (this.options && !this.options.replaceOptions) {
      // nested options aren't spread properly, so we need to do this manually
      this.options.modelOptions = {
        ...this.options.modelOptions,
        ...options.modelOptions
      };
      delete options.modelOptions;
      // now we can merge options
      this.options = {
        ...this.options,
        ...options
      };
    } else {
      this.options = options;
    }

    this.options.examples = this.options.examples.filter(
      obj => obj.input.content !== '' && obj.output.content !== ''
    );

    const modelOptions = this.options.modelOptions || {};
    this.modelOptions = {
      ...modelOptions,
      // set some good defaults (check for undefined in some cases because they may be 0)
      model: modelOptions.model || 'chat-bison',
      temperature: typeof modelOptions.temperature === 'undefined' ? 0.2 : modelOptions.temperature, // 0 - 1, 0.2 is recommended
      topP: typeof modelOptions.topP === 'undefined' ? 0.95 : modelOptions.topP, // 0 - 1, default: 0.95
      topK: typeof modelOptions.topK === 'undefined' ? 40 : modelOptions.topK // 1-40, default: 40
      // stop: modelOptions.stop // no stop method for now
    };

    this.isChatModel = this.modelOptions.model.startsWith('chat-');
    const { isChatModel } = this;
    this.isTextModel = this.modelOptions.model.startsWith('text-');
    const { isTextModel } = this;

    this.maxContextTokens = this.options.maxContextTokens || (isTextModel ? 8000 : 4096);
    // The max prompt tokens is determined by the max context tokens minus the max response tokens.
    // Earlier messages will be dropped until the prompt is within the limit.
    this.maxResponseTokens = this.modelOptions.maxOutputTokens || 1024;
    this.maxPromptTokens = this.options.maxPromptTokens || this.maxContextTokens - this.maxResponseTokens;

    if (this.maxPromptTokens + this.maxResponseTokens > this.maxContextTokens) {
      throw new Error(
        `maxPromptTokens + maxOutputTokens (${this.maxPromptTokens} + ${this.maxResponseTokens} = ${
          this.maxPromptTokens + this.maxResponseTokens
        }) must be less than or equal to maxContextTokens (${this.maxContextTokens})`
      );
    }

    this.userLabel = this.options.userLabel || 'User';
    this.modelLabel = this.options.modelLabel || 'Assistant';

    if (isChatModel) {
      // Use these faux tokens to help the AI understand the context since we are building the chat log ourselves.
      // Trying to use "<|im_start|>" causes the AI to still generate "<" or "<|" at the end sometimes for some reason,
      // without tripping the stop sequences, so I'm using "||>" instead.
      this.startToken = '||>';
      this.endToken = '';
      this.gptEncoder = this.constructor.getTokenizer('cl100k_base');
    } else if (isTextModel) {
      this.startToken = '<|im_start|>';
      this.endToken = '<|im_end|>';
      this.gptEncoder = this.constructor.getTokenizer('text-davinci-003', true, {
        '<|im_start|>': 100264,
        '<|im_end|>': 100265
      });
    } else {
      // Previously I was trying to use "<|endoftext|>" but there seems to be some bug with OpenAI's token counting
      // system that causes only the first "<|endoftext|>" to be counted as 1 token, and the rest are not treated
      // as a single token. So we're using this instead.
      this.startToken = '||>';
      this.endToken = '';
      try {
        this.gptEncoder = this.constructor.getTokenizer(this.modelOptions.model, true);
      } catch {
        this.gptEncoder = this.constructor.getTokenizer('text-davinci-003', true);
      }
    }

    if (!this.modelOptions.stop) {
      const stopTokens = [this.startToken];
      if (this.endToken && this.endToken !== this.startToken) {
        stopTokens.push(this.endToken);
      }
      stopTokens.push(`\n${this.userLabel}:`);
      stopTokens.push('<|diff_marker|>');
      // I chose not to do one for `modelLabel` because I've never seen it happen
      this.modelOptions.stop = stopTokens;
    }

    if (this.options.reverseProxyUrl) {
      this.completionsUrl = this.options.reverseProxyUrl;
    } else {
      this.completionsUrl = this.constructUrl();
    }

    return this;
  }

  static getTokenizer(encoding, isModelName = false, extendSpecialTokens = {}) {
    if (tokenizersCache[encoding]) {
      return tokenizersCache[encoding];
    }
    let tokenizer;
    if (isModelName) {
      tokenizer = encodingForModel(encoding, extendSpecialTokens);
    } else {
      tokenizer = getEncoding(encoding, extendSpecialTokens);
    }
    tokenizersCache[encoding] = tokenizer;
    return tokenizer;
  }

  async getClient() {
    const scopes = ['https://www.googleapis.com/auth/cloud-platform'];
    const jwtClient = new google.auth.JWT(this.client_email, null, this.private_key, scopes);

    jwtClient.authorize((err) => {
      if (err) {
        console.log(err);
        throw err;
      }
    });

    return jwtClient;
  }

  buildPayload(input, { messages = [] }) {
    let payload = {
      instances: [
        {
          messages: [...messages, { author: this.userLabel, content: input }]
        }
      ],
      parameters: this.options.modelOptions
    };

    if (this.options.promptPrefix) {
      payload.instances[0].context = this.options.promptPrefix;
    }

    if (this.options.examples.length > 0) {
      payload.instances[0].examples = this.options.examples;
    }

    if (this.isTextModel) {
      payload.instances = [
        {
          prompt: input
        }
      ];
    }

    if (this.options.debug) {
      console.debug('buildPayload');
      console.dir(payload, { depth: null });
    }

    return payload;
  }

  async getCompletion(input, messages = [], abortController = null) {
    if (!abortController) {
      abortController = new AbortController();
    }
    const { debug } = this.options;
    const url = this.completionsUrl;
    if (debug) {
      console.debug();
      console.debug(url);
      console.debug(this.modelOptions);
      console.debug();
    }
    const opts = {
      method: 'POST',
      agent: new Agent({
        bodyTimeout: 0,
        headersTimeout: 0
      }),
      signal: abortController.signal
    };

    if (this.options.proxy) {
      opts.agent = new ProxyAgent(this.options.proxy);
    }

    const client = await this.getClient();
    const payload = this.buildPayload(input, { messages });
    const res = await client.request({ url, method: 'POST', data: payload });
    console.dir(res.data, { depth: null });
    return res.data;
  }

  async loadHistory(conversationId, parentMessageId = null) {
    if (this.options.debug) {
      console.debug('Loading history for conversation', conversationId, parentMessageId);
    }

    if (!parentMessageId) {
      return [];
    }

    const messages = (await getMessages({ conversationId })) || [];

    if (messages.length === 0) {
      this.currentMessages = [];
      return [];
    }

    const orderedMessages = this.constructor.getMessagesForConversation(messages, parentMessageId);
    return orderedMessages.map((message) => {
      return {
        author: message.isCreatedByUser ? this.userLabel : this.modelLabel,
        content: message.content
      };
    });
  }

  async saveMessageToDatabase(message, user = null) {
    await saveMessage({ ...message, unfinished: false });
    await saveConvo(user, {
      conversationId: message.conversationId,
      endpoint: 'google',
      ...this.modelOptions
    });
  }

  async sendMessage(message, opts = {}) {
    if (opts && typeof opts === 'object') {
      this.setOptions(opts);
    }
    console.log('sendMessage', message, opts);

    const user = opts.user || null;
    const conversationId = opts.conversationId || crypto.randomUUID();
    const parentMessageId = opts.parentMessageId || '00000000-0000-0000-0000-000000000000';
    const userMessageId = crypto.randomUUID();
    const responseMessageId = crypto.randomUUID();
    const messages = await this.loadHistory(conversationId, this.options?.parentMessageId);

    const userMessage = {
      messageId: userMessageId,
      parentMessageId,
      conversationId,
      sender: 'User',
      text: message,
      isCreatedByUser: true
    };

    if (typeof opts?.getIds === 'function') {
      opts.getIds({
        userMessage,
        conversationId,
        responseMessageId
      });
    }

    console.log('userMessage', userMessage);

    await this.saveMessageToDatabase(userMessage, user);
    let reply = '';
    let blocked = false;
    try {
      const result = await this.getCompletion(message, messages, opts.abortController);
      blocked = result?.predictions?.[0]?.safetyAttributes?.blocked;
      reply = result?.predictions?.[0]?.candidates?.[0]?.content || result?.predictions?.[0]?.content || '';
      if (blocked === true) {
        reply = `Google blocked a proper response to your message:\n${JSON.stringify(
          result.predictions[0].safetyAttributes
        )}${reply.length > 0 ? `\nAI Response:\n${reply}` : ''}`;
      }
      if (this.options.debug) {
        console.debug('result');
        console.debug(result);
      }
    } catch (err) {
      console.error(err);
    }

    if (this.options.debug) {
      console.debug('options');
      console.debug(this.options);
    }

    if (!blocked) {
      const textStream = new TextStream(reply, { delay: 0.5 });
      await textStream.processTextStream(opts.onProgress);
    }

    const responseMessage = {
      messageId: responseMessageId,
      conversationId,
      parentMessageId: userMessage.messageId,
      sender: 'PaLM2',
      text: reply,
      error: blocked,
      isCreatedByUser: false
    };

    await this.saveMessageToDatabase(responseMessage, user);
    return responseMessage;
  }

  getTokenCount(text) {
    return this.gptEncoder.encode(text, 'all').length;
  }

  /**
   * Algorithm adapted from "6. Counting tokens for chat API calls" of
   * https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb
   *
   * An additional 2 tokens need to be added for metadata after all messages have been counted.
   *
   * @param {*} message
   */
  getTokenCountForMessage(message) {
    // Map each property of the message to the number of tokens it contains
    const propertyTokenCounts = Object.entries(message).map(([key, value]) => {
      // Count the number of tokens in the property value
      const numTokens = this.getTokenCount(value);

      // Subtract 1 token if the property key is 'name'
      const adjustment = key === 'name' ? 1 : 0;
      return numTokens - adjustment;
    });

    // Sum the number of tokens in all properties and add 4 for metadata
    return propertyTokenCounts.reduce((a, b) => a + b, 4);
  }

  /**
   * Iterate through messages, building an array based on the parentMessageId.
   * Each message has an id and a parentMessageId. The parentMessageId is the id of the message that this message is a reply to.
   * @param messages
   * @param parentMessageId
   * @returns {*[]} An array containing the messages in the order they should be displayed, starting with the root message.
   */
  static getMessagesForConversation(messages, parentMessageId) {
    const orderedMessages = [];
    let currentMessageId = parentMessageId;
    while (currentMessageId) {
      // eslint-disable-next-line no-loop-func
      const message = messages.find(m => m.messageId === currentMessageId);
      if (!message) {
        break;
      }
      orderedMessages.unshift(message);
      currentMessageId = message.parentMessageId;
    }

    if (orderedMessages.length === 0) {
      return [];
    }

    return orderedMessages.map(msg => ({
      isCreatedByUser: msg.isCreatedByUser,
      content: msg.text
    }));
  }
}

module.exports = GoogleAgent;
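
Below is a minimal usage sketch (not part of the diff) mirroring how askGoogle.js wires this client up; the require paths and placeholder values are assumptions, and sendMessage persists messages through the models layer, so it expects the API's MongoDB connection to be available.

```js
// Sketch: instantiating the PaLM client the same way askGoogle.js does.
const GoogleClient = require('./api/app/google/GoogleClient'); // path assumed
const key = require('./api/data/auth.json'); // service-account key: { client_email, project_id, private_key }

const client = new GoogleClient(key, {
  modelOptions: { model: 'chat-bison', temperature: 0.2, topP: 0.95, topK: 40, maxOutputTokens: 1024 },
  examples: [{ input: { content: '' }, output: { content: '' } }], // setOptions() filters this array, so it must exist
  promptPrefix: null
});

client
  .sendMessage('Hello, PaLM!', { onProgress: (chunk) => process.stdout.write(chunk) })
  .then((response) => console.log('\n', response.text));
```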

api/app/stream.js (62 lines, new file)

@@ -0,0 +1,62 @@
const { Readable } = require('stream');

class TextStream extends Readable {
  constructor(text, options = {}) {
    super(options);
    this.text = text;
    this.currentIndex = 0;
    this.delay = options.delay || 20; // Time in milliseconds
  }

  _read() {
    const minChunkSize = 2;
    const maxChunkSize = 4;
    const { delay } = this;

    if (this.currentIndex < this.text.length) {
      setTimeout(() => {
        const remainingChars = this.text.length - this.currentIndex;
        const chunkSize = Math.min(
          this.randomInt(minChunkSize, maxChunkSize + 1),
          remainingChars
        );

        const chunk = this.text.slice(this.currentIndex, this.currentIndex + chunkSize);
        this.push(chunk);
        this.currentIndex += chunkSize;
      }, delay);
    } else {
      this.push(null); // signal end of data
    }
  }

  randomInt(min, max) {
    return Math.floor(Math.random() * (max - min)) + min;
  }

  async processTextStream(onProgressCallback) {
    const streamPromise = new Promise((resolve, reject) => {
      this.on('data', (chunk) => {
        onProgressCallback(chunk.toString());
      });

      this.on('end', () => {
        console.log('Stream ended');
        resolve();
      });

      this.on('error', (err) => {
        reject(err);
      });
    });

    try {
      await streamPromise;
    } catch (err) {
      console.error('Error processing text stream:', err);
      // Handle the error appropriately, e.g., return an error message or throw an error
    }
  }
}

module.exports = TextStream;
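
A minimal usage sketch (not part of the diff), showing the simulated streaming that GoogleClient.sendMessage relies on; the require path is assumed.

```js
const TextStream = require('./api/app/stream'); // path assumed

// Emits the reply a few characters at a time, roughly every 20 ms.
const stream = new TextStream('This response arrives a few characters at a time.', { delay: 20 });
stream
  .processTextStream((chunk) => process.stdout.write(chunk))
  .then(() => console.log('\ndone'));
```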

@@ -1,7 +1,8 @@
const { Configuration, OpenAIApi } = require('openai');
const _ = require('lodash');
const { genAzureEndpoint } = require('../utils/genAzureEndpoints');

const proxyEnvToAxiosProxy = proxyString => {
const proxyEnvToAxiosProxy = (proxyString) => {
  if (!proxyString) return null;

  const regex = /^([^:]+):\/\/(?:([^:@]*):?([^:@]*)@)?([^:]+)(?::(\d+))?/;
@@ -33,7 +34,9 @@ const titleConvo = async ({ endpoint, text, response }) => {
||>Title:`
  };

  const azure = process.env.AZURE_OPENAI_API_KEY ? true : false;
  const options = {
    azure,
    reverseProxyUrl: process.env.OPENAI_REVERSE_PROXY || null,
    proxy: process.env.PROXY || null
  };
@@ -47,9 +50,20 @@ const titleConvo = async ({ endpoint, text, response }) => {
    frequency_penalty: 0
  };

  const titleGenClient = new ChatGPTClient(process.env.OPENAI_KEY, titleGenClientOptions);
  let apiKey = process.env.OPENAI_KEY;

  if (azure) {
    apiKey = process.env.AZURE_OPENAI_API_KEY;
    titleGenClientOptions.reverseProxyUrl = genAzureEndpoint({
      azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
      azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
      azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION
    });
  }

  const titleGenClient = new ChatGPTClient(apiKey, titleGenClientOptions);
  const result = await titleGenClient.getCompletion([instructionsPayload], null);
  title = result.choices[0].message.content.replace(/\s+/g, ' ').trim();
  title = result.choices[0].message.content.replace(/\s+/g, ' ').replaceAll('"', '').trim();
} catch (e) {
  console.error(e);
  console.log('There was an issue generating title, see error above');
@@ -2,7 +2,8 @@
const regex = / \[.*?]\(.*?\)/g;

const getCitations = (res) => {
  const textBlocks = res.details.adaptiveCards[0].body;
  const adaptiveCards = res.details.adaptiveCards;
  const textBlocks = adaptiveCards && adaptiveCards[0].body;
  if (!textBlocks) return '';
  let links = textBlocks[textBlocks.length - 1]?.text.match(regex);
  if (links?.length === 0 || !links) return '';
@@ -30,7 +30,7 @@ module.exports = {
      return { message: 'Error saving conversation' };
    }
  },
  getConvosByPage: async (user, pageNumber = 1, pageSize = 12) => {
  getConvosByPage: async (user, pageNumber = 1, pageSize = 14) => {
    try {
      const totalConvos = (await Conversation.countDocuments({ user })) || 1;
      const totalPages = Math.ceil(totalConvos / pageSize);
@@ -45,7 +45,7 @@ module.exports = {
      return { message: 'Error getting conversations' };
    }
  },
  getConvosQueried: async (user, convoIds, pageNumber = 1, pageSize = 12) => {
  getConvosQueried: async (user, convoIds, pageNumber = 1, pageSize = 14) => {
    try {
      if (!convoIds || convoIds.length === 0) {
        return { conversations: [], pages: 1, pageNumber, pageSize };
@@ -57,7 +57,7 @@ module.exports = {
      // will handle a syncing solution soon
      const deletedConvoIds = [];

      convoIds.forEach((convo) =>
      convoIds.forEach(convo =>
        promises.push(
          Conversation.findOne({
            user,
@@ -120,7 +120,7 @@ module.exports = {
  },
  deleteConvos: async (user, filter) => {
    let toRemove = await Conversation.find({ ...filter, user }).select('conversationId');
    const ids = toRemove.map((instance) => instance.conversationId);
    const ids = toRemove.map(instance => instance.conversationId);
    let deleteCount = await Conversation.deleteMany({ ...filter, user }).exec();
    deleteCount.messages = await deleteMessages({ conversationId: { $in: ids } });
    return deleteCount;
@@ -17,6 +17,12 @@ module.exports = {
    default: null,
    required: false
  },
  // for google only
  modelLabel: {
    type: String,
    default: null,
    required: false
  },
  promptPrefix: {
    type: String,
    default: null,
@@ -32,6 +38,22 @@ module.exports = {
    default: 1,
    required: false
  },
  // for google only
  topP: {
    type: Number,
    default: 0.95,
    required: false
  },
  topK: {
    type: Number,
    default: 40,
    required: false
  },
  maxOutputTokens: {
    type: Number,
    default: 1024,
    required: false
  },
  presence_penalty: {
    type: Number,
    default: 0,

@@ -20,6 +20,8 @@ const convoSchema = mongoose.Schema(
      default: null
    },
    messages: [{ type: mongoose.Schema.Types.ObjectId, ref: 'Message' }],
    // google only
    examples: [{ type: mongoose.Schema.Types.Mixed }],
    ...conversationPreset,
    // for bingAI only
    jailbreakConversationId: {

@@ -17,6 +17,8 @@ const presetSchema = mongoose.Schema(
      type: String,
      default: null
    },
    // google only
    examples: [{ type: mongoose.Schema.Types.Mixed }],
    ...conversationPreset
  },
  { timestamps: true }
api/package-lock.json (10967 lines, generated) — file diff suppressed because it is too large

@@ -1,6 +1,6 @@
{
  "name": "chatgpt-clone",
  "version": "0.4.1",
  "name": "chat-backend",
  "version": "0.4.5",
  "description": "",
  "main": "server/index.js",
  "scripts": {
@@ -21,7 +21,7 @@
  "dependencies": {
    "@dqbd/tiktoken": "^1.0.2",
    "@keyv/mongo": "^2.1.8",
    "@waylaidwanderer/chatgpt-api": "github:danny-avila/node-chatgpt-api",
    "@waylaidwanderer/chatgpt-api": "^1.36.0",
    "axios": "^1.3.4",
    "bcrypt": "^5.1.0",
    "bcryptjs": "^2.4.3",
@@ -32,6 +32,7 @@
    "dotenv": "^16.0.3",
    "eslint": "^8.36.0",
    "express": "^4.18.2",
    "googleapis": "^118.0.0",
    "handlebars": "^4.7.7",
    "html": "^1.0.0",
    "joi": "^14.3.1",
@@ -42,13 +43,13 @@
    "meilisearch": "^0.31.1",
    "mongoose": "^6.9.0",
    "nodemailer": "^6.9.1",
    "og-chatgpt-api": "npm:@waylaidwanderer/chatgpt-api@^1.35.0",
    "openai": "^3.1.0",
    "passport": "^0.6.0",
    "passport-facebook": "^3.0.0",
    "passport-google-oauth20": "^2.0.0",
    "passport-jwt": "^4.0.1",
    "passport-local": "^1.0.0",
    "pino": "^8.12.1",
    "sanitize": "^2.1.2"
  },
  "devDependencies": {
api/server/routes/ask/askGoogle.js (156 lines, new file)

@@ -0,0 +1,156 @@
const express = require('express');
const router = express.Router();
const { titleConvo } = require('../../../app/');
const GoogleClient = require('../../../app/google/GoogleClient');
const { saveMessage, getConvoTitle, saveConvo, getConvo } = require('../../../models');
const { handleError, sendMessage, createOnProgress } = require('./handlers');
const requireJwtAuth = require('../../../middleware/requireJwtAuth');

router.post('/', requireJwtAuth, async (req, res) => {
  const { endpoint, text, parentMessageId, conversationId } = req.body;
  if (text.length === 0) return handleError(res, { text: 'Prompt empty or too short' });
  if (endpoint !== 'google') return handleError(res, { text: 'Illegal request' });

  // build endpoint option
  const endpointOption = {
    examples: req.body?.examples ?? [{ input: { content: '' }, output: { content: '' } }],
    promptPrefix: req.body?.promptPrefix ?? null,
    token: req.body?.token ?? null,
    modelOptions: {
      model: req.body?.model ?? 'chat-bison',
      modelLabel: req.body?.modelLabel ?? null,
      temperature: req.body?.temperature ?? 0.2,
      maxOutputTokens: req.body?.maxOutputTokens ?? 1024,
      topP: req.body?.topP ?? 0.95,
      topK: req.body?.topK ?? 40
    }
  };

  const availableModels = ['chat-bison', 'text-bison'];
  if (availableModels.find(model => model === endpointOption.modelOptions.model) === undefined) {
    return handleError(res, { text: `Illegal request: model` });
  }

  // eslint-disable-next-line no-use-before-define
  return await ask({
    text,
    endpointOption,
    conversationId,
    parentMessageId,
    req,
    res
  });
});

const ask = async ({ text, endpointOption, parentMessageId = null, conversationId, req, res }) => {
  res.writeHead(200, {
    Connection: 'keep-alive',
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache, no-transform',
    'Access-Control-Allow-Origin': '*',
    'X-Accel-Buffering': 'no'
  });
  let userMessage;
  let userMessageId;
  let responseMessageId;
  let lastSavedTimestamp = 0;

  try {
    const getIds = (data) => {
      userMessage = data.userMessage;
      userMessageId = userMessage.messageId;
      responseMessageId = data.responseMessageId;
      if (!conversationId) {
        conversationId = data.conversationId;
      }
    };

    const { onProgress: progressCallback } = createOnProgress({
      onProgress: ({ text: partialText }) => {
        const currentTimestamp = Date.now();
        if (currentTimestamp - lastSavedTimestamp > 500) {
          lastSavedTimestamp = currentTimestamp;
          saveMessage({
            messageId: responseMessageId,
            sender: 'PaLM2',
            conversationId,
            parentMessageId: userMessageId,
            text: partialText,
            unfinished: true,
            cancelled: false,
            error: false
          });
        }
      }
    });

    const abortController = new AbortController();

    let key;
    if (endpointOption.token) {
      key = JSON.parse(endpointOption.token);
      delete endpointOption.token;
      console.log('Using service account key provided by User for PaLM models');
    }

    try {
      if (!key) {
        key = require('../../../data/auth.json');
      }
    } catch (e) {
      console.log("No 'auth.json' file (service account key) found in /api/data/ for PaLM models");
    }

    const clientOptions = {
      // debug: true, // for testing
      reverseProxyUrl: process.env.GOOGLE_REVERSE_PROXY || null,
      proxy: process.env.PROXY || null,
      ...endpointOption
    };

    const client = new GoogleClient(key, clientOptions);

    let response = await client.sendMessage(text, {
      getIds,
      user: req.user.id,
      parentMessageId,
      conversationId,
      onProgress: progressCallback.call(null, { res, text, parentMessageId: userMessageId }),
      abortController
    });

    await saveMessage(response);
    sendMessage(res, {
      title: await getConvoTitle(req.user.id, conversationId),
      final: true,
      conversation: await getConvo(req.user.id, conversationId),
      requestMessage: userMessage,
      responseMessage: response
    });
    res.end();

    if (parentMessageId == '00000000-0000-0000-0000-000000000000') {
      const title = await titleConvo({ text, response });
      await saveConvo(req.user.id, {
        conversationId: conversationId,
        title
      });
    }
  } catch (error) {
    console.error(error);
    const errorMessage = {
      messageId: responseMessageId,
      sender: 'PaLM2',
      conversationId,
      parentMessageId,
      unfinished: false,
      cancelled: false,
      error: true,
      text: error.message
    };
    await saveMessage(errorMessage);
    handleError(res, errorMessage);
  }
};

module.exports = router;
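
A minimal client-side sketch (not part of the diff) of calling this route; the '/api/ask/google' mount path and the Bearer-token header are assumptions based on router.use('/google', ...) and requireJwtAuth, and the response is read as a raw event stream.

```js
// Sketch: posting a prompt to the PaLM route and reading the streamed reply.
async function askPaLM(jwtToken) {
  const res = await fetch('/api/ask/google', { // mount path assumed
    method: 'POST',
    headers: { 'Content-Type': 'application/json', Authorization: `Bearer ${jwtToken}` },
    body: JSON.stringify({
      endpoint: 'google',  // the route rejects any other value
      text: 'Summarize the PaLM 2 release notes.',
      model: 'chat-bison', // must be 'chat-bison' or 'text-bison'
      parentMessageId: '00000000-0000-0000-0000-000000000000'
    })
  });

  // The route responds with a server-sent event stream; read it incrementally.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    console.log(decoder.decode(value));
  }
}
```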

@@ -2,11 +2,13 @@ const express = require('express');
const router = express.Router();
// const askAzureOpenAI = require('./askAzureOpenAI';)
const askOpenAI = require('./askOpenAI');
const askGoogle = require('./askGoogle');
const askBingAI = require('./askBingAI');
const askChatGPTBrowser = require('./askChatGPTBrowser');

// router.use('/azureOpenAI', askAzureOpenAI);
router.use('/openAI', askOpenAI);
router.use('/google', askGoogle);
router.use('/bingAI', askBingAI);
router.use('/chatGPTBrowser', askChatGPTBrowser);
@@ -9,15 +9,39 @@ const getOpenAIModels = () => {
};

const getChatGPTBrowserModels = () => {
  let models = ['text-davinci-002-render-sha', 'text-davinci-002-render-paid', 'gpt-4'];
  let models = ['text-davinci-002-render-sha', 'gpt-4'];
  if (process.env.CHATGPT_MODELS) models = String(process.env.CHATGPT_MODELS).split(',');

  return models;
};

router.get('/', function (req, res) {
let i = 0;
router.get('/', async function (req, res) {
  let key, palmUser;
  try {
    key = require('../../data/auth.json');
  } catch (e) {
    if (i === 0) {
      console.log("No 'auth.json' file (service account key) found in /api/data/ for PaLM models");
      i++;
    }
  }

  if (process.env.PALM_KEY === 'user_provided') {
    palmUser = true;
    if (i <= 1) {
      console.log('User will provide key for PaLM models');
      i++;
    }
  }

  const google =
    key || palmUser ? { userProvide: palmUser, availableModels: ['chat-bison', 'text-bison'] } : false;
  const azureOpenAI = !!process.env.AZURE_OPENAI_KEY;
  const openAI = process.env.OPENAI_KEY ? { availableModels: getOpenAIModels() } : false;
  const openAI =
    process.env.OPENAI_KEY || process.env.AZURE_OPENAI_API_KEY
      ? { availableModels: getOpenAIModels() }
      : false;
  const bingAI = process.env.BINGAI_TOKEN
    ? { userProvide: process.env.BINGAI_TOKEN == 'user_provided' }
    : false;
@@ -28,7 +52,7 @@ router.get('/', function (req, res) {
    }
    : false;

  res.send(JSON.stringify({ azureOpenAI, openAI, bingAI, chatGPTBrowser }));
  res.send(JSON.stringify({ azureOpenAI, openAI, google, bingAI, chatGPTBrowser }));
});

module.exports = { router, getOpenAIModels, getChatGPTBrowserModels };
api/utils/LoggingSystem.js (125 lines, new file)

@@ -0,0 +1,125 @@
const pino = require('pino');

const logger = pino({
  level: 'info',
  redact: {
    paths: [ // List of Paths to redact from the logs (https://getpino.io/#/docs/redaction)
      'env.OPENAI_KEY',
      'env.BINGAI_TOKEN',
      'env.CHATGPT_TOKEN',
      'env.MEILI_MASTER_KEY',
      'env.GOOGLE_CLIENT_SECRET',
      'env.JWT_SECRET_DEV',
      'env.JWT_SECRET_PROD',
      'newUser.password'], // See example to filter object class instances
    censor: '***', // Redaction character
  },
});

// Sanitize outside the logger paths. This is useful for sanitizing variables directly with Regex and patterns.
const redactPatterns = [ // Array of regular expressions for redacting patterns
  /api[-_]?key/i,
  /password/i,
  /token/i,
  /secret/i,
  /key/i,
  /certificate/i,
  /client[-_]?id/i,
  /authorization[-_]?code/i,
  /authorization[-_]?login[-_]?hint/i,
  /authorization[-_]?acr[-_]?values/i,
  /authorization[-_]?response[-_]?mode/i,
  /authorization[-_]?nonce/i
];

/*
// Example of redacting sensitive data from object class instances
function redactSensitiveData(obj) {
  if (obj instanceof User) {
    return {
      ...obj.toObject(),
      password: '***', // Redact the password field
    };
  }
  return obj;
}

// Example of redacting sensitive data from object class instances
logger.info({ newUser: redactSensitiveData(newUser) }, 'newUser');
*/

const levels = {
  TRACE: 10,
  DEBUG: 20,
  INFO: 30,
  WARN: 40,
  ERROR: 50,
  FATAL: 60
};

let level = levels.INFO;

module.exports = {
  levels,
  setLevel: (l) => (level = l),
  log: {
    trace: (msg) => {
      if (level <= levels.TRACE) return;
      logger.trace(msg);
    },
    debug: (msg) => {
      if (level <= levels.DEBUG) return;
      logger.debug(msg);
    },
    info: (msg) => {
      if (level <= levels.INFO) return;
      logger.info(msg);
    },
    warn: (msg) => {
      if (level <= levels.WARN) return;
      logger.warn(msg);
    },
    error: (msg) => {
      if (level <= levels.ERROR) return;
      logger.error(msg);
    },
    fatal: (msg) => {
      if (level <= levels.FATAL) return;
      logger.fatal(msg);
    },

    // Custom loggers
    parameters: (parameters) => {
      if (level <= levels.TRACE) return;
      logger.debug({ parameters }, 'Function Parameters');
    },
    functionName: (name) => {
      if (level <= levels.TRACE) return;
      logger.debug(`EXECUTING: ${name}`);
    },
    flow: (flow) => {
      if (level <= levels.INFO) return;
      logger.debug(`BEGIN FLOW: ${flow}`);
    },
    variable: ({ name, value }) => {
      if (level <= levels.DEBUG) return;
      // Check if the variable name matches any of the redact patterns and redact the value
      let sanitizedValue = value;
      for (const pattern of redactPatterns) {
        if (pattern.test(name)) {
          sanitizedValue = '***';
          break;
        }
      }
      logger.debug({ variable: { name, value: sanitizedValue } }, `VARIABLE ${name}`);
    },
    request: () => (req, res, next) => {
      if (level < levels.DEBUG) return next();
      logger.debug({ query: req.query, body: req.body }, `Hit URL ${req.url} with following`);
      return next();
    }
  }
};
|
||||
|
||||
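A minimal usage sketch for the new logging module (require path and values are assumed). Note that each wrapper returns early when the current level is less than or equal to its threshold, so calls are only forwarded to pino when the level is set higher:

```js
const { log, levels, setLevel } = require('./api/utils/LoggingSystem');

setLevel(levels.FATAL); // highest level, so the info/debug checks above pass

log.info('Server started');
log.functionName('askGoogle');
// 'OPENAI_KEY' matches the /key/i redact pattern, so its value is replaced
// with '***' before being handed to pino.
log.variable({ name: 'OPENAI_KEY', value: 'sk-example' });
```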
api/utils/genAzureEndpoints.js (Normal file, 5 lines)
@@ -0,0 +1,5 @@
|
||||
function genAzureEndpoint({ azureOpenAIApiInstanceName, azureOpenAIApiDeploymentName, azureOpenAIApiVersion }) {
|
||||
return `https://${azureOpenAIApiInstanceName}.openai.azure.com/openai/deployments/${azureOpenAIApiDeploymentName}/chat/completions?api-version=${azureOpenAIApiVersion}`;
|
||||
}
|
||||
|
||||
module.exports = { genAzureEndpoint };
|
||||
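A quick usage sketch of the helper above; the instance and deployment names are hypothetical:

```js
const { genAzureEndpoint } = require('./api/utils/genAzureEndpoints');

const url = genAzureEndpoint({
  azureOpenAIApiInstanceName: 'my-instance',
  azureOpenAIApiDeploymentName: 'gpt-35-turbo',
  azureOpenAIApiVersion: '2023-03-15-preview'
});
// => 'https://my-instance.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-03-15-preview'
```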
client/package-lock.json (generated, 19932 lines): file diff suppressed because it is too large.
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "chatgpt-clone",
|
||||
"version": "0.4.1",
|
||||
"name": "chat-frontend",
|
||||
"version": "0.4.5",
|
||||
"description": "",
|
||||
"type": "module",
|
||||
"scripts": {
|
||||
@@ -53,6 +53,7 @@
|
||||
"html2canvas": "^1.4.1",
|
||||
"lodash": "^4.17.21",
|
||||
"lucide-react": "^0.113.0",
|
||||
"pino": "^8.12.1",
|
||||
"rc-input-number": "^7.4.2",
|
||||
"react": "^18.2.0",
|
||||
"react-dom": "^18.2.0",
|
||||
|
||||
New binary files (contents not shown):

- client/public/assets/palm.png (1.9 KiB)
- client/public/fonts/signifier-bold-italic.woff2
- client/public/fonts/signifier-bold.woff2
- client/public/fonts/signifier-light-italic.woff2
- client/public/fonts/signifier-light.woff2
- client/public/fonts/soehne-buch-kursiv.woff2
- client/public/fonts/soehne-buch.woff2
- client/public/fonts/soehne-halbfett-kursiv.woff2
- client/public/fonts/soehne-halbfett.woff2
- client/public/fonts/soehne-kraftig-kursiv.woff2
- client/public/fonts/soehne-kraftig.woff2
- client/public/fonts/soehne-mono-buch-kursiv.woff2
- client/public/fonts/soehne-mono-buch.woff2
- client/public/fonts/soehne-mono-halbfett.woff2
@@ -126,7 +126,7 @@ export default function Conversation({ conversation, retainView }) {
|
||||
/>
|
||||
</div>
|
||||
) : (
|
||||
<div className="absolute inset-y-0 right-0 z-10 w-8 bg-gradient-to-l from-gray-900 group-hover:from-[#2A2B32]" />
|
||||
<div className="absolute inset-y-0 right-0 z-10 w-8 bg-gradient-to-l from-gray-900 group-hover:from-[#2A2B32] rounded-r-md" />
|
||||
)}
|
||||
</a>
|
||||
);
|
||||
|
||||
@@ -40,7 +40,7 @@ function Settings(props) {
|
||||
|
||||
|
||||
return (
|
||||
<>
|
||||
<div className="max-h-[350px] overflow-y-auto">
|
||||
<div className="grid gap-6 sm:grid-cols-2">
|
||||
<div className="col-span-1 flex flex-col items-center justify-start gap-6">
|
||||
<div className="grid w-full items-center gap-2">
|
||||
@@ -141,7 +141,7 @@ function Settings(props) {
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
|
||||
@@ -1,4 +1,6 @@
|
||||
import React, { useEffect, useState } from 'react';
|
||||
import Examples from './Google/Examples.jsx';
|
||||
import MessagesSquared from '~/components/svg/MessagesSquared.jsx';
|
||||
import { useSetRecoilState, useRecoilValue } from 'recoil';
|
||||
import filenamify from 'filenamify';
|
||||
import axios from 'axios';
|
||||
@@ -7,6 +9,7 @@ import DialogTemplate from '../ui/DialogTemplate';
|
||||
import { Dialog, DialogClose, DialogButton } from '../ui/Dialog.tsx';
|
||||
import { Input } from '../ui/Input.tsx';
|
||||
import { Label } from '../ui/Label.tsx';
|
||||
import { Button } from '../ui/Button.tsx';
|
||||
import Dropdown from '../ui/Dropdown';
|
||||
import { cn } from '~/utils/';
|
||||
import cleanupPreset from '~/utils/cleanupPreset';
|
||||
@@ -18,11 +21,14 @@ import store from '~/store';
|
||||
const EditPresetDialog = ({ open, onOpenChange, preset: _preset, title }) => {
|
||||
// const [title, setTitle] = useState('My Preset');
|
||||
const [preset, setPreset] = useState(_preset);
|
||||
const [showExamples, setShowExamples] = useState(false);
|
||||
const setPresets = useSetRecoilState(store.presets);
|
||||
|
||||
const availableEndpoints = useRecoilValue(store.availableEndpoints);
|
||||
const endpointsConfig = useRecoilValue(store.endpointsConfig);
|
||||
|
||||
const triggerExamples = () => setShowExamples(prev => !prev);
|
||||
|
||||
const setOption = param => newValue => {
|
||||
let update = {};
|
||||
update[param] = newValue;
|
||||
@@ -37,6 +43,69 @@ const EditPresetDialog = ({ open, onOpenChange, preset: _preset, title }) => {
|
||||
);
|
||||
};
|
||||
|
||||
const setExample = (i, type, newValue = null) => {
|
||||
let update = {};
|
||||
let current = preset?.examples.slice() || [];
|
||||
let currentExample = { ...current[i] } || {};
|
||||
currentExample[type] = { content: newValue };
|
||||
current[i] = currentExample;
|
||||
update.examples = current;
|
||||
setPreset(prevState =>
|
||||
cleanupPreset({
|
||||
preset: {
|
||||
...prevState,
|
||||
...update
|
||||
},
|
||||
endpointsConfig
|
||||
})
|
||||
);
|
||||
};
|
||||
|
||||
const addExample = () => {
|
||||
let update = {};
|
||||
let current = preset?.examples.slice() || [];
|
||||
current.push({ input: { content: '' }, output: { content: '' } });
|
||||
update.examples = current;
|
||||
setPreset(prevState =>
|
||||
cleanupPreset({
|
||||
preset: {
|
||||
...prevState,
|
||||
...update
|
||||
},
|
||||
endpointsConfig
|
||||
})
|
||||
);
|
||||
};
|
||||
|
||||
const removeExample = () => {
|
||||
let update = {};
|
||||
let current = preset?.examples.slice() || [];
|
||||
if (current.length <= 1) {
|
||||
update.examples = [{ input: { content: '' }, output: { content: '' } }];
|
||||
setPreset(prevState =>
|
||||
cleanupPreset({
|
||||
preset: {
|
||||
...prevState,
|
||||
...update
|
||||
},
|
||||
endpointsConfig
|
||||
})
|
||||
);
|
||||
return;
|
||||
}
|
||||
current.pop();
|
||||
update.examples = current;
|
||||
setPreset(prevState =>
|
||||
cleanupPreset({
|
||||
preset: {
|
||||
...prevState,
|
||||
...update
|
||||
},
|
||||
endpointsConfig
|
||||
})
|
||||
);
|
||||
};
|
||||
|
||||
const defaultTextProps =
|
||||
'rounded-md border border-gray-200 focus:border-slate-400 focus:bg-gray-50 bg-transparent text-sm shadow-[0_0_10px_rgba(0,0,0,0.05)] outline-none placeholder:text-gray-400 focus:outline-none focus:ring-gray-400 focus:ring-opacity-20 focus:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 dark:border-gray-500 dark:bg-gray-700 focus:dark:bg-gray-600 dark:text-gray-50 dark:shadow-[0_0_15px_rgba(0,0,0,0.10)] dark:focus:border-gray-400 dark:focus:outline-none dark:focus:ring-0 dark:focus:ring-gray-400 dark:focus:ring-offset-0';
|
||||
|
||||
@@ -111,14 +180,35 @@ const EditPresetDialog = ({ open, onOpenChange, preset: _preset, title }) => {
|
||||
)}
|
||||
containerClassName="flex w-full resize-none"
|
||||
/>
|
||||
{preset?.endpoint === 'google' && (
|
||||
<Button
|
||||
type="button"
|
||||
className="ml-1 flex h-auto w-full bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-0 focus:ring-offset-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white dark:focus:outline-none dark:focus:ring-offset-0"
|
||||
onClick={triggerExamples}
|
||||
>
|
||||
<MessagesSquared className="mr-1 w-[14px]" />
|
||||
{(showExamples ? 'Hide' : 'Show') + ' Examples'}
|
||||
</Button>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
<div className="my-4 w-full border-t border-gray-300 dark:border-gray-500" />
|
||||
<div className="w-full p-0">
|
||||
<Settings
|
||||
preset={preset}
|
||||
setOption={setOption}
|
||||
/>
|
||||
{((preset?.endpoint === 'google' && !showExamples) || preset?.endpoint !== 'google') && (
|
||||
<Settings
|
||||
preset={_preset}
|
||||
setOption={setOption}
|
||||
/>
|
||||
)}
|
||||
{preset?.endpoint === 'google' && showExamples && (
|
||||
<Examples
|
||||
examples={preset.examples}
|
||||
setExample={setExample}
|
||||
addExample={addExample}
|
||||
removeExample={removeExample}
|
||||
edit={true}
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
}
|
||||
|
||||
@@ -12,12 +12,16 @@ import store from '~/store';
|
||||
|
||||
// A preset dialog to show readonly preset values.
|
||||
const EndpointOptionsDialog = ({ open, onOpenChange, preset: _preset, title }) => {
|
||||
// const [title, setTitle] = useState('My Preset');
|
||||
const [preset, setPreset] = useState(_preset);
|
||||
const [endpointName, setEndpointName] = useState(preset?.endpoint);
|
||||
|
||||
const [saveAsDialogShow, setSaveAsDialogShow] = useState(false);
|
||||
const endpointsConfig = useRecoilValue(store.endpointsConfig);
|
||||
|
||||
if (endpointName === 'google') {
|
||||
setEndpointName('PaLM');
|
||||
}
|
||||
|
||||
const setOption = param => newValue => {
|
||||
let update = {};
|
||||
update[param] = newValue;
|
||||
@@ -50,7 +54,7 @@ const EndpointOptionsDialog = ({ open, onOpenChange, preset: _preset, title }) =
|
||||
onOpenChange={onOpenChange}
|
||||
>
|
||||
<DialogTemplate
|
||||
title={`${title || 'View Options'} - ${preset?.endpoint}`}
|
||||
title={`${title || 'View Options'} - ${endpointName}`}
|
||||
className="max-w-full sm:max-w-4xl"
|
||||
main={
|
||||
<div className="flex w-full flex-col items-center gap-2">
|
||||
|
||||
@@ -4,7 +4,13 @@ import CrossIcon from '../svg/CrossIcon';
|
||||
// import SaveIcon from '../svg/SaveIcon';
|
||||
import { Save } from 'lucide-react';
|
||||
|
||||
function EndpointOptionsPopover({ content, visible, saveAsPreset, switchToSimpleMode }) {
|
||||
function EndpointOptionsPopover({
|
||||
content,
|
||||
visible,
|
||||
saveAsPreset,
|
||||
switchToSimpleMode,
|
||||
additionalButton = null
|
||||
}) {
|
||||
const cardStyle =
|
||||
'shadow-md rounded-md min-w-[75px] font-normal bg-white border-black/10 border dark:bg-gray-700 text-black dark:text-white';
|
||||
|
||||
@@ -12,29 +18,39 @@ function EndpointOptionsPopover({ content, visible, saveAsPreset, switchToSimple
|
||||
<>
|
||||
<div
|
||||
className={
|
||||
' endpointOptionsPopover-container absolute bottom-[-10px] flex w-full flex-col items-center justify-center md:px-4' +
|
||||
' endpointOptionsPopover-container absolute bottom-[-10px] flex w-full flex-col items-center md:px-4' +
|
||||
(visible ? ' show' : '')
|
||||
}
|
||||
>
|
||||
<div
|
||||
className={
|
||||
cardStyle +
|
||||
' border-s-0 border-d-0 flex w-full flex-col overflow-hidden rounded-none border-t bg-slate-200 px-0 pb-[10px] dark:border-white/10 md:rounded-md md:border lg:w-[736px]'
|
||||
' border-d-0 flex w-full flex-col overflow-hidden rounded-none border-s-0 border-t bg-slate-200 px-0 pb-[10px] dark:border-white/10 md:rounded-md md:border lg:w-[736px]'
|
||||
}
|
||||
>
|
||||
<div className="flex w-full items-center justify-between bg-slate-100 px-2 py-2 dark:bg-gray-800/60">
|
||||
<div className="flex w-full items-center bg-slate-100 px-2 py-2 dark:bg-gray-800/60">
|
||||
{/* <span className="text-xs font-medium font-normal">Advanced settings for OpenAI endpoint</span> */}
|
||||
<Button
|
||||
type="button"
|
||||
className="h-auto bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white"
|
||||
className="h-auto justify-start bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white dark:focus:outline-none dark:focus:ring-offset-0"
|
||||
onClick={saveAsPreset}
|
||||
>
|
||||
<Save className="mr-1 w-[14px]" />
|
||||
Save as preset
|
||||
</Button>
|
||||
{additionalButton && (
|
||||
<Button
|
||||
type="button"
|
||||
className="ml-1 h-auto justify-start bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-0 focus:ring-offset-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white dark:focus:outline-none dark:focus:ring-offset-0"
|
||||
onClick={additionalButton.handler}
|
||||
>
|
||||
{additionalButton.icon}
|
||||
{additionalButton.label}
|
||||
</Button>
|
||||
)}
|
||||
<Button
|
||||
type="button"
|
||||
className="h-auto bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white"
|
||||
className="ml-auto h-auto bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-offset-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white"
|
||||
onClick={switchToSimpleMode}
|
||||
>
|
||||
<CrossIcon className="mr-1" />
|
||||
|
||||
client/src/components/Endpoints/Google/Examples.jsx (Normal file, 98 lines)
@@ -0,0 +1,98 @@
|
||||
import React from 'react';
|
||||
import TextareaAutosize from 'react-textarea-autosize';
|
||||
import { Button } from '~/components/ui/Button.tsx';
|
||||
import { Label } from '~/components/ui/Label.tsx';
|
||||
import { Plus, Minus } from 'lucide-react';
|
||||
import { cn } from '~/utils/';
|
||||
const defaultTextProps =
|
||||
'rounded-md border border-gray-200 focus:border-slate-400 focus:bg-gray-50 bg-transparent text-sm shadow-[0_0_10px_rgba(0,0,0,0.05)] outline-none placeholder:text-gray-400 focus:outline-none focus:ring-gray-400 focus:ring-opacity-20 focus:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 dark:border-gray-500 dark:bg-gray-700 focus:dark:bg-gray-600 dark:text-gray-50 dark:shadow-[0_0_15px_rgba(0,0,0,0.10)] dark:focus:border-gray-400 dark:focus:outline-none dark:focus:ring-0 dark:focus:ring-gray-400 dark:focus:ring-offset-0';
|
||||
|
||||
function Examples({ readonly, examples, setExample, addExample, removeExample, edit = false }) {
|
||||
const maxHeight = edit ? 'max-h-[233px]' : 'max-h-[350px]';
|
||||
return (
|
||||
<>
|
||||
<div className={`${maxHeight} overflow-y-auto`}>
|
||||
<div
|
||||
id="examples-grid"
|
||||
className="grid gap-6 sm:grid-cols-2"
|
||||
>
|
||||
{examples.map((example, idx) => (
|
||||
<React.Fragment key={idx}>
|
||||
{/* Input */}
|
||||
<div
|
||||
className={`col-span-${
|
||||
examples.length === 1 ? '1' : 'full'
|
||||
} flex flex-col items-center justify-start gap-6 sm:col-span-1`}
|
||||
>
|
||||
<div className="grid w-full items-center gap-2">
|
||||
<Label
|
||||
htmlFor={`input-${idx}`}
|
||||
className="text-left text-sm font-medium"
|
||||
>
|
||||
Input <small className="opacity-40">(default: blank)</small>
|
||||
</Label>
|
||||
<TextareaAutosize
|
||||
id={`input-${idx}`}
|
||||
disabled={readonly}
|
||||
value={example?.input?.content || ''}
|
||||
onChange={e => setExample(idx, 'input', e.target.value || null)}
|
||||
placeholder="Set example input. Example is ignored if empty."
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
'flex max-h-[300px] min-h-[75px] w-full resize-none px-3 py-2 '
|
||||
)}
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Output */}
|
||||
<div
|
||||
className={`col-span-${
|
||||
examples.length === 1 ? '1' : 'full'
|
||||
} flex flex-col items-center justify-start gap-6 sm:col-span-1`}
|
||||
>
|
||||
<div className="grid w-full items-center gap-2">
|
||||
<Label
|
||||
htmlFor={`output-${idx}`}
|
||||
className="text-left text-sm font-medium"
|
||||
>
|
||||
Output <small className="opacity-40">(default: blank)</small>
|
||||
</Label>
|
||||
<TextareaAutosize
|
||||
id={`output-${idx}`}
|
||||
disabled={readonly}
|
||||
value={example?.output?.content || ''}
|
||||
onChange={e => setExample(idx, 'output', e.target.value || null)}
|
||||
placeholder={`Set example output. Example is ignored if empty.`}
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
'flex max-h-[300px] min-h-[75px] w-full resize-none px-3 py-2 '
|
||||
)}
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
</React.Fragment>
|
||||
))}
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex justify-center">
|
||||
<Button
|
||||
type="button"
|
||||
className="mr-2 mt-1 h-auto items-center justify-center bg-transparent px-3 py-2 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-0 focus:ring-offset-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-600 dark:hover:text-white dark:focus:outline-none dark:focus:ring-offset-0"
|
||||
onClick={removeExample}
|
||||
>
|
||||
<Minus className="w-[16px]" />
|
||||
</Button>
|
||||
<Button
|
||||
type="button"
|
||||
className="mt-1 h-auto items-center justify-center bg-transparent px-3 py-2 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-0 focus:ring-offset-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-600 dark:hover:text-white dark:focus:outline-none dark:focus:ring-offset-0"
|
||||
onClick={addExample}
|
||||
>
|
||||
<Plus className="w-[16px]" />
|
||||
</Button>
|
||||
</div>
|
||||
</>
|
||||
);
|
||||
}
|
||||
|
||||
export default Examples;
|
||||
client/src/components/Endpoints/Google/OptionHover.jsx (Normal file, 32 lines)
@@ -0,0 +1,32 @@
|
||||
import React from 'react';
|
||||
import { HoverCardPortal, HoverCardContent } from '~/components/ui/HoverCard.tsx';
|
||||
|
||||
const types = {
|
||||
temp: 'Higher values = more random, while lower values = more focused and deterministic. We recommend altering this or Top P but not both.',
|
||||
topp: 'Top-p changes how the model selects tokens for output. Tokens are selected from most K (see topK parameter) probable to least until the sum of their probabilities equals the top-p value.',
|
||||
topk: "Top-k changes how the model selects tokens for output. A top-k of 1 means the selected token is the most probable among all tokens in the model's vocabulary (also called greedy decoding), while a top-k of 3 means that the next token is selected from among the 3 most probable tokens (using temperature).",
|
||||
maxoutputtokens: " Maximum number of tokens that can be generated in the response. Specify a lower value for shorter responses and a higher value for longer responses."
|
||||
};
|
||||
|
||||
function OptionHover({ type, side }) {
|
||||
// const options = {};
|
||||
// if (type === 'pres') {
|
||||
// options.sideOffset = 45;
|
||||
// }
|
||||
|
||||
return (
|
||||
<HoverCardPortal>
|
||||
<HoverCardContent
|
||||
side={side}
|
||||
className="w-80 "
|
||||
// {...options}
|
||||
>
|
||||
<div className="space-y-2">
|
||||
<p className="text-sm text-gray-600 dark:text-gray-300">{types[type]}</p>
|
||||
</div>
|
||||
</HoverCardContent>
|
||||
</HoverCardPortal>
|
||||
);
|
||||
}
|
||||
|
||||
export default OptionHover;
|
||||
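The hover descriptions above cover the four PaLM sampling parameters exposed by the new Google settings panel that follows. As a rough sketch, they map onto a Vertex AI request's parameters object, using the defaults from that panel; the exact field names are an assumption about the Vertex API, not something this PR pins down:

```js
// Assumed Vertex AI parameter names; defaults match the Google Settings component below.
const parameters = {
  temperature: 0.2,      // higher = more random output
  topP: 0.95,            // nucleus sampling cutoff
  topK: 40,              // sample from the 40 most probable tokens
  maxOutputTokens: 1024  // upper bound on response length
};
```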
client/src/components/Endpoints/Google/Settings.jsx (Normal file, 271 lines)
@@ -0,0 +1,271 @@
|
||||
import { useRecoilValue } from 'recoil';
|
||||
import TextareaAutosize from 'react-textarea-autosize';
|
||||
import SelectDropDown from '../../ui/SelectDropDown';
|
||||
import { Input } from '~/components/ui/Input.tsx';
|
||||
import { Label } from '~/components/ui/Label.tsx';
|
||||
import { Slider } from '~/components/ui/Slider.tsx';
|
||||
import { InputNumber } from '~/components/ui/InputNumber.tsx';
|
||||
import OptionHover from './OptionHover';
|
||||
import { HoverCard, HoverCardTrigger } from '~/components/ui/HoverCard.tsx';
|
||||
import { cn } from '~/utils/';
|
||||
const defaultTextProps =
|
||||
'rounded-md border border-gray-200 focus:border-slate-400 focus:bg-gray-50 bg-transparent text-sm shadow-[0_0_10px_rgba(0,0,0,0.05)] outline-none placeholder:text-gray-400 focus:outline-none focus:ring-gray-400 focus:ring-opacity-20 focus:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 dark:border-gray-500 dark:bg-gray-700 focus:dark:bg-gray-600 dark:text-gray-50 dark:shadow-[0_0_15px_rgba(0,0,0,0.10)] dark:focus:border-gray-400 dark:focus:outline-none dark:focus:ring-0 dark:focus:ring-gray-400 dark:focus:ring-offset-0';
|
||||
|
||||
const optionText =
|
||||
'p-0 shadow-none text-right pr-1 h-8 border-transparent focus:ring-[#10a37f] focus:ring-offset-0 focus:ring-opacity-100 hover:bg-gray-800/10 dark:hover:bg-white/10 focus:bg-gray-800/10 dark:focus:bg-white/10 transition-colors';
|
||||
|
||||
import store from '~/store';
|
||||
|
||||
function Settings(props) {
|
||||
const { readonly, model, modelLabel, promptPrefix, temperature, topP, topK, maxOutputTokens, setOption, edit = false } = props;
|
||||
const maxHeight = edit ? 'max-h-[233px]' : 'max-h-[350px]';
|
||||
const endpointsConfig = useRecoilValue(store.endpointsConfig);
|
||||
|
||||
const setModel = setOption('model');
|
||||
const setModelLabel = setOption('modelLabel');
|
||||
const setPromptPrefix = setOption('promptPrefix');
|
||||
const setTemperature = setOption('temperature');
|
||||
const setTopP = setOption('topP');
|
||||
const setTopK = setOption('topK');
|
||||
const setMaxOutputTokens = setOption('maxOutputTokens');
|
||||
|
||||
const models = endpointsConfig?.['google']?.['availableModels'] || [];
|
||||
|
||||
return (
|
||||
<div className={`${maxHeight} overflow-y-auto`}>
|
||||
<div className="grid gap-6 sm:grid-cols-2">
|
||||
<div className="col-span-1 flex flex-col items-center justify-start gap-6">
|
||||
<div className="grid w-full items-center gap-2">
|
||||
<SelectDropDown
|
||||
value={model}
|
||||
setValue={setModel}
|
||||
availableValues={models}
|
||||
disabled={readonly}
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
'flex w-full resize-none focus:outline-none focus:ring-0 focus:ring-opacity-0 focus:ring-offset-0'
|
||||
)}
|
||||
containerClassName="flex w-full resize-none"
|
||||
/>
|
||||
</div>
|
||||
<div className="grid w-full items-center gap-2">
|
||||
<Label
|
||||
htmlFor="modelLabel"
|
||||
className="text-left text-sm font-medium"
|
||||
>
|
||||
Custom Name <small className="opacity-40">(default: blank)</small>
|
||||
</Label>
|
||||
<Input
|
||||
id="modelLabel"
|
||||
disabled={readonly}
|
||||
value={modelLabel || ''}
|
||||
onChange={e => setModelLabel(e.target.value || null)}
|
||||
placeholder="Set a custom name for PaLM2"
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
'flex h-10 max-h-10 w-full resize-none px-3 py-2 focus:outline-none focus:ring-0 focus:ring-opacity-0 focus:ring-offset-0'
|
||||
)}
|
||||
/>
|
||||
</div>
|
||||
<div className="grid w-full items-center gap-2">
|
||||
<Label
|
||||
htmlFor="promptPrefix"
|
||||
className="text-left text-sm font-medium"
|
||||
>
|
||||
Prompt Prefix <small className="opacity-40">(default: blank)</small>
|
||||
</Label>
|
||||
<TextareaAutosize
|
||||
id="promptPrefix"
|
||||
disabled={readonly}
|
||||
value={promptPrefix || ''}
|
||||
onChange={e => setPromptPrefix(e.target.value || null)}
|
||||
placeholder="Set custom instructions or context. Ignored if empty."
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
'flex max-h-[300px] min-h-[100px] w-full resize-none px-3 py-2 '
|
||||
)}
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
<div className="col-span-1 flex flex-col items-center justify-start gap-6">
|
||||
<HoverCard openDelay={300}>
|
||||
<HoverCardTrigger className="grid w-full items-center gap-2">
|
||||
<div className="flex justify-between">
|
||||
<Label
|
||||
htmlFor="temp-int"
|
||||
className="text-left text-sm font-medium"
|
||||
>
|
||||
Temperature <small className="opacity-40">(default: 0.2)</small>
|
||||
</Label>
|
||||
<InputNumber
|
||||
id="temp-int"
|
||||
disabled={readonly}
|
||||
value={temperature}
|
||||
onChange={value => setTemperature(value)}
|
||||
max={1}
|
||||
min={0}
|
||||
step={0.01}
|
||||
controls={false}
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
cn(
|
||||
optionText,
|
||||
'reset-rc-number-input reset-rc-number-input-text-right h-auto w-12 border-0 group-hover/temp:border-gray-200'
|
||||
)
|
||||
)}
|
||||
/>
|
||||
</div>
|
||||
<Slider
|
||||
disabled={readonly}
|
||||
value={[temperature]}
|
||||
onValueChange={value => setTemperature(value[0])}
|
||||
doubleClickHandler={() => setTemperature(1)}
|
||||
max={1}
|
||||
min={0}
|
||||
step={0.01}
|
||||
className="flex h-4 w-full"
|
||||
/>
|
||||
</HoverCardTrigger>
|
||||
<OptionHover
|
||||
type="temp"
|
||||
side="left"
|
||||
/>
|
||||
</HoverCard>
|
||||
<HoverCard openDelay={300}>
|
||||
<HoverCardTrigger className="grid w-full items-center gap-2">
|
||||
<div className="flex justify-between">
|
||||
<Label
|
||||
htmlFor="top-p-int"
|
||||
className="text-left text-sm font-medium"
|
||||
>
|
||||
Top P <small className="opacity-40">(default: 0.95)</small>
|
||||
</Label>
|
||||
<InputNumber
|
||||
id="top-p-int"
|
||||
disabled={readonly}
|
||||
value={topP}
|
||||
onChange={value => setTopP(value)}
|
||||
max={1}
|
||||
min={0}
|
||||
step={0.01}
|
||||
controls={false}
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
cn(
|
||||
optionText,
|
||||
'reset-rc-number-input reset-rc-number-input-text-right h-auto w-12 border-0 group-hover/temp:border-gray-200'
|
||||
)
|
||||
)}
|
||||
/>
|
||||
</div>
|
||||
<Slider
|
||||
disabled={readonly}
|
||||
value={[topP]}
|
||||
onValueChange={value => setTopP(value[0])}
|
||||
doubleClickHandler={() => setTopP(1)}
|
||||
max={1}
|
||||
min={0}
|
||||
step={0.01}
|
||||
className="flex h-4 w-full"
|
||||
/>
|
||||
</HoverCardTrigger>
|
||||
<OptionHover
|
||||
type="topp"
|
||||
side="left"
|
||||
/>
|
||||
</HoverCard>
|
||||
|
||||
<HoverCard openDelay={300}>
|
||||
<HoverCardTrigger className="grid w-full items-center gap-2">
|
||||
<div className="flex justify-between">
|
||||
<Label
|
||||
htmlFor="top-k-int"
|
||||
className="text-left text-sm font-medium"
|
||||
>
|
||||
Top K <small className="opacity-40">(default: 40)</small>
|
||||
</Label>
|
||||
<InputNumber
|
||||
id="top-k-int"
|
||||
disabled={readonly}
|
||||
value={topK}
|
||||
onChange={value => setTopK(value)}
|
||||
max={40}
|
||||
min={1}
|
||||
step={0.01}
|
||||
controls={false}
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
cn(
|
||||
optionText,
|
||||
'reset-rc-number-input reset-rc-number-input-text-right h-auto w-12 border-0 group-hover/temp:border-gray-200'
|
||||
)
|
||||
)}
|
||||
/>
|
||||
</div>
|
||||
<Slider
|
||||
disabled={readonly}
|
||||
value={[topK]}
|
||||
onValueChange={value => setTopK(value[0])}
|
||||
doubleClickHandler={() => setTopK(0)}
|
||||
max={40}
|
||||
min={1}
|
||||
step={0.01}
|
||||
className="flex h-4 w-full"
|
||||
/>
|
||||
</HoverCardTrigger>
|
||||
<OptionHover
|
||||
type="topk"
|
||||
side="left"
|
||||
/>
|
||||
</HoverCard>
|
||||
|
||||
<HoverCard openDelay={300}>
|
||||
<HoverCardTrigger className="grid w-full items-center gap-2">
|
||||
<div className="flex justify-between">
|
||||
<Label
|
||||
htmlFor="max-tokens-int"
|
||||
className="text-left text-sm font-medium"
|
||||
>
|
||||
Max Output Tokens <small className="opacity-40">(default: 1024)</small>
|
||||
</Label>
|
||||
<InputNumber
|
||||
id="max-tokens-int"
|
||||
disabled={readonly}
|
||||
value={maxOutputTokens}
|
||||
onChange={value => setMaxOutputTokens(value)}
|
||||
max={1024}
|
||||
min={1}
|
||||
step={1}
|
||||
controls={false}
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
cn(
|
||||
optionText,
|
||||
'reset-rc-number-input reset-rc-number-input-text-right h-auto w-12 border-0 group-hover/temp:border-gray-200'
|
||||
)
|
||||
)}
|
||||
/>
|
||||
</div>
|
||||
<Slider
|
||||
disabled={readonly}
|
||||
value={[maxOutputTokens]}
|
||||
onValueChange={value => setMaxOutputTokens(value[0])}
|
||||
doubleClickHandler={() => setMaxOutputTokens(0)}
|
||||
max={1024}
|
||||
min={1}
|
||||
step={1}
|
||||
className="flex h-4 w-full"
|
||||
/>
|
||||
</HoverCardTrigger>
|
||||
<OptionHover
|
||||
type="maxoutputtokens"
|
||||
side="left"
|
||||
/>
|
||||
</HoverCard>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
export default Settings;
|
||||
@@ -32,7 +32,7 @@ function Settings(props) {
|
||||
const models = endpointsConfig?.['openAI']?.['availableModels'] || [];
|
||||
|
||||
return (
|
||||
<>
|
||||
<div className="max-h-[350px] overflow-y-auto">
|
||||
<div className="grid gap-6 sm:grid-cols-2">
|
||||
<div className="col-span-1 flex flex-col items-center justify-start gap-6">
|
||||
<div className="grid w-full items-center gap-2">
|
||||
@@ -264,7 +264,7 @@ function Settings(props) {
|
||||
</HoverCard>
|
||||
</div>
|
||||
</div>
|
||||
</>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
|
||||
@@ -2,13 +2,15 @@ import React from 'react';
|
||||
|
||||
import OpenAISettings from './OpenAI/Settings.jsx';
|
||||
import BingAISettings from './BingAI/Settings.jsx';
|
||||
import GoogleSettings from './Google/Settings.jsx';
|
||||
|
||||
// A preset dialog to show readonly preset values.
|
||||
const Settings = ({ preset, ...props }) => {
|
||||
const renderSettings = () => {
|
||||
const { endpoint } = preset || {};
|
||||
// console.log('preset', preset);
|
||||
|
||||
if (endpoint === 'openAI')
|
||||
if (endpoint === 'openAI') {
|
||||
return (
|
||||
<OpenAISettings
|
||||
model={preset?.model}
|
||||
@@ -21,7 +23,7 @@ const Settings = ({ preset, ...props }) => {
|
||||
{...props}
|
||||
/>
|
||||
);
|
||||
else if (endpoint === 'bingAI')
|
||||
} else if (endpoint === 'bingAI') {
|
||||
return (
|
||||
<BingAISettings
|
||||
toneStyle={preset?.toneStyle}
|
||||
@@ -31,7 +33,24 @@ const Settings = ({ preset, ...props }) => {
|
||||
{...props}
|
||||
/>
|
||||
);
|
||||
else return <div className="text-black dark:text-white">Not implemented</div>;
|
||||
} else if (endpoint === 'google') {
|
||||
return (
|
||||
<GoogleSettings
|
||||
model={preset?.model}
|
||||
modelLabel={preset?.modelLabel}
|
||||
promptPrefix={preset?.promptPrefix}
|
||||
examples={preset?.examples}
|
||||
temperature={preset?.temperature}
|
||||
topP={preset?.topP}
|
||||
topK={preset?.topK}
|
||||
maxOutputTokens={preset?.maxOutputTokens}
|
||||
edit={true}
|
||||
{...props}
|
||||
/>
|
||||
);
|
||||
} else {
|
||||
return <div className="text-black dark:text-white">Not implemented</div>;
|
||||
}
|
||||
};
|
||||
|
||||
return renderSettings();
|
||||
|
||||
client/src/components/Input/GoogleOptions/index.jsx (Normal file, 170 lines)
@@ -0,0 +1,170 @@
|
||||
import { useState } from 'react';
|
||||
import { Settings2 } from 'lucide-react';
|
||||
import { useRecoilState, useRecoilValue } from 'recoil';
|
||||
import MessagesSquared from '~/components/svg/MessagesSquared.jsx';
|
||||
import SelectDropDown from '../../ui/SelectDropDown';
|
||||
import EndpointOptionsPopover from '../../Endpoints/EndpointOptionsPopover';
|
||||
import SaveAsPresetDialog from '../../Endpoints/SaveAsPresetDialog';
|
||||
import { Button } from '../../ui/Button.tsx';
|
||||
import Settings from '../../Endpoints/Google/Settings.jsx';
|
||||
import Examples from '../../Endpoints/Google/Examples.jsx';
|
||||
import { cn } from '~/utils/';
|
||||
|
||||
import store from '~/store';
|
||||
|
||||
function GoogleOptions() {
|
||||
const [advancedMode, setAdvancedMode] = useState(false);
|
||||
const [showExamples, setShowExamples] = useState(false);
|
||||
const [saveAsDialogShow, setSaveAsDialogShow] = useState(false);
|
||||
|
||||
const [conversation, setConversation] = useRecoilState(store.conversation) || {};
|
||||
const { endpoint, conversationId } = conversation;
|
||||
const { model, modelLabel, promptPrefix, examples, temperature, topP, topK, maxOutputTokens } =
|
||||
conversation;
|
||||
|
||||
const endpointsConfig = useRecoilValue(store.endpointsConfig);
|
||||
|
||||
if (endpoint !== 'google') return null;
|
||||
if (conversationId !== 'new') return null;
|
||||
|
||||
const models = endpointsConfig?.['google']?.['availableModels'] || [];
|
||||
|
||||
const triggerAdvancedMode = () => setAdvancedMode(prev => !prev);
|
||||
const triggerExamples = () => setShowExamples(prev => !prev);
|
||||
|
||||
const switchToSimpleMode = () => {
|
||||
setAdvancedMode(false);
|
||||
};
|
||||
|
||||
const saveAsPreset = () => {
|
||||
setSaveAsDialogShow(true);
|
||||
};
|
||||
|
||||
const setOption = param => newValue => {
|
||||
let update = {};
|
||||
update[param] = newValue;
|
||||
setConversation(prevState => ({
|
||||
...prevState,
|
||||
...update
|
||||
}));
|
||||
};
|
||||
|
||||
const setExample = (i, type, newValue = null) => {
|
||||
let update = {};
|
||||
let current = conversation?.examples.slice() || [];
|
||||
let currentExample = { ...current[i] } || {};
|
||||
currentExample[type] = { content: newValue };
|
||||
current[i] = currentExample;
|
||||
update.examples = current;
|
||||
setConversation(prevState => ({
|
||||
...prevState,
|
||||
...update
|
||||
}));
|
||||
};
|
||||
|
||||
const addExample = () => {
|
||||
let update = {};
|
||||
let current = conversation?.examples.slice() || [];
|
||||
current.push({ input: { content: '' }, output: { content: '' } });
|
||||
update.examples = current;
|
||||
setConversation(prevState => ({
|
||||
...prevState,
|
||||
...update
|
||||
}));
|
||||
};
|
||||
|
||||
const removeExample = () => {
|
||||
let update = {};
|
||||
let current = conversation?.examples.slice() || [];
|
||||
if (current.length <= 1) {
|
||||
update.examples = [{ input: { content: '' }, output: { content: '' } }];
|
||||
setConversation(prevState => ({
|
||||
...prevState,
|
||||
...update
|
||||
}));
|
||||
return;
|
||||
}
|
||||
current.pop();
|
||||
update.examples = current;
|
||||
setConversation(prevState => ({
|
||||
...prevState,
|
||||
...update
|
||||
}));
|
||||
};
|
||||
|
||||
const cardStyle =
|
||||
'transition-colors shadow-md rounded-md min-w-[75px] font-normal bg-white border-black/10 hover:border-black/10 focus:border-black/10 dark:border-black/10 dark:hover:border-black/10 dark:focus:border-black/10 border dark:bg-gray-700 text-black dark:text-white';
|
||||
|
||||
return (
|
||||
<>
|
||||
<div
|
||||
className={
|
||||
'openAIOptions-simple-container flex w-full flex-wrap items-center justify-center gap-2' +
|
||||
(!advancedMode ? ' show' : '')
|
||||
}
|
||||
>
|
||||
<SelectDropDown
|
||||
value={model}
|
||||
setValue={setOption('model')}
|
||||
availableValues={models}
|
||||
showAbove={true}
|
||||
showLabel={false}
|
||||
className={cn(
|
||||
cardStyle,
|
||||
'min-w-48 z-50 flex h-[40px] w-48 flex-none items-center justify-center px-4 ring-0 hover:cursor-pointer hover:bg-slate-50 focus:ring-0 focus:ring-offset-0 data-[state=open]:bg-slate-50 dark:bg-gray-700 dark:hover:bg-gray-600 dark:data-[state=open]:bg-gray-600'
|
||||
)}
|
||||
/>
|
||||
<Button
|
||||
type="button"
|
||||
className={cn(
|
||||
cardStyle,
|
||||
'min-w-4 z-50 flex h-[40px] flex-none items-center justify-center px-4 hover:bg-slate-50 focus:ring-0 focus:ring-offset-0 dark:hover:bg-gray-600'
|
||||
)}
|
||||
onClick={triggerAdvancedMode}
|
||||
>
|
||||
<Settings2 className="w-4 text-gray-600 dark:text-white" />
|
||||
</Button>
|
||||
</div>
|
||||
<EndpointOptionsPopover
|
||||
content={
|
||||
<div className="px-4 py-4">
|
||||
{showExamples ? (
|
||||
<Examples
|
||||
examples={examples}
|
||||
setExample={setExample}
|
||||
addExample={addExample}
|
||||
removeExample={removeExample}
|
||||
/>
|
||||
) : (
|
||||
<Settings
|
||||
model={model}
|
||||
modelLabel={modelLabel}
|
||||
promptPrefix={promptPrefix}
|
||||
temperature={temperature}
|
||||
topP={topP}
|
||||
topK={topK}
|
||||
maxOutputTokens={maxOutputTokens}
|
||||
setOption={setOption}
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
}
|
||||
visible={advancedMode}
|
||||
saveAsPreset={saveAsPreset}
|
||||
switchToSimpleMode={switchToSimpleMode}
|
||||
additionalButton={{
|
||||
label: (showExamples ? 'Hide' : 'Show') + ' Examples',
|
||||
handler: triggerExamples,
|
||||
icon: <MessagesSquared className="mr-1 w-[14px]" />
|
||||
}}
|
||||
/>
|
||||
<SaveAsPresetDialog
|
||||
open={saveAsDialogShow}
|
||||
onOpenChange={setSaveAsDialogShow}
|
||||
preset={conversation}
|
||||
/>
|
||||
</>
|
||||
);
|
||||
}
|
||||
|
||||
export default GoogleOptions;
|
||||
@@ -7,6 +7,14 @@ import SetTokenDialog from '../SetTokenDialog';
|
||||
|
||||
import store from '../../../store';
|
||||
|
||||
const alternateName = {
|
||||
openAI: 'OpenAI',
|
||||
azureOpenAI: 'Azure OpenAI',
|
||||
bingAI: 'Bing',
|
||||
chatGPTBrowser: 'ChatGPT',
|
||||
google: 'PaLM',
|
||||
}
|
||||
|
||||
export default function ModelItem({ endpoint, value, onSelect }) {
|
||||
const [setTokenDialogOpen, setSetTokenDialogOpen] = useState(false);
|
||||
const endpointsConfig = useRecoilValue(store.endpointsConfig);
|
||||
@@ -28,7 +36,7 @@ export default function ModelItem({ endpoint, value, onSelect }) {
|
||||
className="group dark:font-semibold dark:text-gray-100 dark:hover:bg-gray-800"
|
||||
>
|
||||
{icon}
|
||||
{endpoint}
|
||||
{alternateName[endpoint] || endpoint}
|
||||
{!!['azureOpenAI', 'openAI'].find(e => e === endpoint) && <sup>$</sup>}
|
||||
<div className="flex w-4 flex-1" />
|
||||
{isuserProvide ? (
|
||||
|
||||
@@ -1,13 +1,10 @@
|
||||
import React from 'react';
|
||||
import { useState } from 'react';
|
||||
import { FileUp } from 'lucide-react';
|
||||
import cleanupPreset from '~/utils/cleanupPreset.js';
|
||||
import { useRecoilValue } from 'recoil';
|
||||
import { cn } from '~/utils/';
|
||||
|
||||
import store from '~/store';
|
||||
|
||||
const FileUpload = ({ onFileSelected }) => {
|
||||
// const setPresets = useSetRecoilState(store.presets);
|
||||
const endpointsConfig = useRecoilValue(store.endpointsConfig);
|
||||
const FileUpload = ({ onFileSelected, successText = null, invalidText = null, validator = null, text = null, id = '1' }) => {
|
||||
const [statusColor, setStatusColor] = useState('text-gray-600');
|
||||
const [status, setStatus] = useState(null);
|
||||
|
||||
const handleFileChange = event => {
|
||||
const file = event.target.files[0];
|
||||
@@ -16,20 +13,34 @@ const FileUpload = ({ onFileSelected }) => {
|
||||
const reader = new FileReader();
|
||||
reader.onload = e => {
|
||||
const jsonData = JSON.parse(e.target.result);
|
||||
onFileSelected({ ...cleanupPreset({ preset: jsonData, endpointsConfig }), presetId: null });
|
||||
if (validator && !validator(jsonData)) {
|
||||
setStatus('invalid');
|
||||
setStatusColor('text-red-600');
|
||||
return;
|
||||
}
|
||||
|
||||
if (validator) {
|
||||
setStatus('success');
|
||||
setStatusColor('text-green-500 dark:text-green-500');
|
||||
}
|
||||
|
||||
onFileSelected(jsonData);
|
||||
};
|
||||
reader.readAsText(file);
|
||||
};
|
||||
|
||||
return (
|
||||
<label
|
||||
htmlFor="file-upload"
|
||||
className=" mr-1 flex h-auto cursor-pointer items-center rounded bg-transparent px-2 py-1 text-xs font-medium font-normal text-gray-600 transition-colors hover:bg-slate-200 hover:text-green-700 dark:bg-transparent dark:text-gray-300 dark:hover:bg-gray-800 dark:hover:text-green-500"
|
||||
htmlFor={`file-upload-${id}`}
|
||||
className={cn(
|
||||
'mr-1 flex h-auto cursor-pointer items-center rounded bg-transparent px-2 py-1 text-xs font-medium font-normal transition-colors hover:bg-slate-200 hover:text-green-700 dark:bg-transparent dark:text-gray-300 dark:hover:bg-gray-800 dark:hover:text-green-500',
|
||||
statusColor
|
||||
)}
|
||||
>
|
||||
<FileUp className="mr-1 flex w-[22px] items-center stroke-1" />
|
||||
<span className="flex text-xs ">Import</span>
|
||||
<span className="flex text-xs ">{!status ? text || 'Import' : (status === 'success' ? successText : invalidText)}</span>
|
||||
<input
|
||||
id="file-upload"
|
||||
id={`file-upload-${id}`}
|
||||
value=""
|
||||
type="file"
|
||||
className="hidden "
|
||||
|
||||
@@ -10,6 +10,7 @@ export default function PresetItem({ preset = {}, value, onSelect, onChangePrese
|
||||
const icon = getIcon({
|
||||
size: 20,
|
||||
endpoint: preset?.endpoint,
|
||||
model: preset?.model,
|
||||
error: false,
|
||||
className: 'mr-2'
|
||||
});
|
||||
@@ -21,6 +22,10 @@ export default function PresetItem({ preset = {}, value, onSelect, onChangePrese
|
||||
const { chatGptLabel, model } = preset;
|
||||
if (model) _title += `: ${model}`;
|
||||
if (chatGptLabel) _title += ` as ${chatGptLabel}`;
|
||||
} else if (endpoint === 'google') {
|
||||
const { modelLabel, model } = preset;
|
||||
if (model) _title += `: ${model}`;
|
||||
if (modelLabel) _title += ` as ${modelLabel}`;
|
||||
} else if (endpoint === 'bingAI') {
|
||||
const { jailbreak, toneStyle } = preset;
|
||||
if (toneStyle) _title += `: ${toneStyle}`;
|
||||
|
||||
@@ -1,4 +1,5 @@
|
||||
import React, { useState, useEffect } from 'react';
|
||||
import cleanupPreset from '~/utils/cleanupPreset.js';
|
||||
import { useRecoilValue, useRecoilState } from 'recoil';
|
||||
import EditPresetDialog from '../../Endpoints/EditPresetDialog';
|
||||
import EndpointItems from './EndpointItems';
|
||||
@@ -23,10 +24,12 @@ import store from '~/store';
|
||||
|
||||
export default function NewConversationMenu() {
|
||||
const [menuOpen, setMenuOpen] = useState(false);
|
||||
const [showPresets, setShowPresets] = useState(true);
|
||||
const [presetModelVisible, setPresetModelVisible] = useState(false);
|
||||
const [preset, setPreset] = useState(false);
|
||||
|
||||
const availableEndpoints = useRecoilValue(store.availableEndpoints);
|
||||
const endpointsConfig = useRecoilValue(store.endpointsConfig);
|
||||
const [presets, setPresets] = useRecoilState(store.presets);
|
||||
|
||||
const conversation = useRecoilValue(store.conversation) || {};
|
||||
@@ -50,6 +53,11 @@ export default function NewConversationMenu() {
|
||||
);
|
||||
};
|
||||
|
||||
const onFileSelected = jsonData => {
|
||||
const jsonPreset = { ...cleanupPreset({ preset: jsonData, endpointsConfig }), presetId: null };
|
||||
importPreset(jsonPreset);
|
||||
};
|
||||
|
||||
// update the default model when availableModels changes
|
||||
// typically, availableModels changes => modelsFilter or customGPTModels changes
|
||||
useEffect(() => {
|
||||
@@ -64,7 +72,10 @@ export default function NewConversationMenu() {
|
||||
if (endpoint) {
|
||||
const lastSelectedModel = JSON.parse(localStorage.getItem('lastSelectedModel')) || {};
|
||||
localStorage.setItem('lastConversationSetup', JSON.stringify(conversation));
|
||||
localStorage.setItem('lastSelectedModel', JSON.stringify({ ...lastSelectedModel, [endpoint] : conversation.model }));
|
||||
localStorage.setItem(
|
||||
'lastSelectedModel',
|
||||
JSON.stringify({ ...lastSelectedModel, [endpoint]: conversation.model })
|
||||
);
|
||||
}
|
||||
}, [conversation]);
|
||||
|
||||
@@ -149,9 +160,14 @@ export default function NewConversationMenu() {
|
||||
<div className="mt-6 w-full" />
|
||||
|
||||
<DropdownMenuLabel className="flex items-center dark:text-gray-300">
|
||||
<span>Select a Preset</span>
|
||||
<span
|
||||
className="cursor-pointer"
|
||||
onClick={() => setShowPresets(prev => !prev)}
|
||||
>
|
||||
{showPresets ? 'Hide ' : 'Show '} Presets
|
||||
</span>
|
||||
<div className="flex-1" />
|
||||
<FileUpload onFileSelected={importPreset} />
|
||||
<FileUpload onFileSelected={onFileSelected} />
|
||||
<Dialog>
|
||||
<DialogTrigger asChild>
|
||||
<label
|
||||
@@ -181,18 +197,19 @@ export default function NewConversationMenu() {
|
||||
<DropdownMenuSeparator />
|
||||
<DropdownMenuRadioGroup
|
||||
onValueChange={onSelectPreset}
|
||||
className="overflow-y-auto"
|
||||
className="max-h-[150px] overflow-y-auto"
|
||||
>
|
||||
{presets.length ? (
|
||||
<PresetItems
|
||||
presets={presets}
|
||||
onSelect={onSelectPreset}
|
||||
onChangePreset={onChangePreset}
|
||||
onDeletePreset={onDeletePreset}
|
||||
/>
|
||||
) : (
|
||||
<DropdownMenuLabel className="dark:text-gray-300">No preset yet.</DropdownMenuLabel>
|
||||
)}
|
||||
{showPresets &&
|
||||
(presets.length ? (
|
||||
<PresetItems
|
||||
presets={presets}
|
||||
onSelect={onSelectPreset}
|
||||
onChangePreset={onChangePreset}
|
||||
onDeletePreset={onDeletePreset}
|
||||
/>
|
||||
) : (
|
||||
<DropdownMenuLabel className="dark:text-gray-300">No preset yet.</DropdownMenuLabel>
|
||||
))}
|
||||
</DropdownMenuRadioGroup>
|
||||
</DropdownMenuContent>
|
||||
</DropdownMenu>
|
||||
|
||||
@@ -1,12 +1,10 @@
|
||||
import React, { useEffect, useState } from 'react';
|
||||
import { useRecoilValue } from 'recoil';
|
||||
import DialogTemplate from '../../ui/DialogTemplate';
|
||||
import { Dialog } from '../../ui/Dialog.tsx';
|
||||
import { Input } from '../../ui/Input.tsx';
|
||||
import { Label } from '../../ui/Label.tsx';
|
||||
import { cn } from '~/utils/';
|
||||
import cleanupPreset from '~/utils/cleanupPreset';
|
||||
import { useCreatePresetMutation } from '~/data-provider';
|
||||
import FileUpload from '../NewConversationMenu/FileUpload';
|
||||
import store from '~/store';
|
||||
|
||||
const SetTokenDialog = ({ open, onOpenChange, endpoint }) => {
|
||||
@@ -54,6 +52,30 @@ const SetTokenDialog = ({ open, onOpenChange, endpoint }) => {
|
||||
</a>
|
||||
. Copy access token.
|
||||
</small>
|
||||
),
|
||||
google: (
|
||||
<small className="break-all text-gray-600">
|
||||
You need to{' '}
|
||||
<a
|
||||
target="_blank"
|
||||
href="https://console.cloud.google.com/vertex-ai"
|
||||
rel="noreferrer"
|
||||
className="text-blue-600 underline"
|
||||
>
|
||||
Enable Vertex AI
|
||||
</a>{' '}
|
||||
API on Google Cloud, then{' '}
|
||||
<a
|
||||
target="_blank"
|
||||
href="https://console.cloud.google.com/projectselector/iam-admin/serviceaccounts/create?walkthrough_id=iam--create-service-account#step_index=1"
|
||||
rel="noreferrer"
|
||||
className="text-blue-600 underline"
|
||||
>
|
||||
Create a Service Account
|
||||
</a>
|
||||
. Make sure to click 'Create and Continue' to give at least the 'Vertex AI User' role. Lastly, create
|
||||
a JSON key to import here.
|
||||
</small>
|
||||
)
|
||||
};
|
||||
|
||||
@@ -73,19 +95,61 @@ const SetTokenDialog = ({ open, onOpenChange, endpoint }) => {
|
||||
Token Name
|
||||
<br />
|
||||
</Label>
|
||||
<Input
|
||||
id="chatGptLabel"
|
||||
value={token || ''}
|
||||
onChange={e => setToken(e.target.value || '')}
|
||||
placeholder="Set the token."
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
'flex h-10 max-h-10 w-full resize-none px-3 py-2 focus:outline-none focus:ring-0 focus:ring-opacity-0 focus:ring-offset-0'
|
||||
)}
|
||||
/>
|
||||
<small className="text-red-600">
|
||||
Your token will be sent to the server, but not saved.
|
||||
</small>
|
||||
{endpoint === 'google' ? (
|
||||
<FileUpload
|
||||
id="googleKey"
|
||||
className="w-full"
|
||||
text="Import Service Account JSON Key"
|
||||
successText="Successfully Imported Service Account JSON Key"
|
||||
invalidText="Invalid Service Account JSON Key, Did you import the correct file?"
|
||||
validator={credentials => {
|
||||
if (!credentials) {
|
||||
return false;
|
||||
}
|
||||
|
||||
if (
|
||||
!credentials.client_email ||
|
||||
typeof credentials.client_email !== 'string' ||
|
||||
credentials.client_email.length <= 2
|
||||
) {
|
||||
return false;
|
||||
}
|
||||
|
||||
if (
|
||||
!credentials.project_id ||
|
||||
typeof credentials.project_id !== 'string' ||
|
||||
credentials.project_id.length <= 2
|
||||
) {
|
||||
return false;
|
||||
}
|
||||
|
||||
if (
|
||||
!credentials.private_key ||
|
||||
typeof credentials.private_key !== 'string' ||
|
||||
credentials.private_key.length <= 600
|
||||
) {
|
||||
return false;
|
||||
}
|
||||
|
||||
return true;
|
||||
}}
|
||||
onFileSelected={data => {
|
||||
setToken(JSON.stringify(data));
|
||||
}}
|
||||
/>
|
||||
) : (
|
||||
<Input
|
||||
id="chatGptLabel"
|
||||
value={token || ''}
|
||||
onChange={e => setToken(e.target.value || '')}
|
||||
placeholder="Set the token."
|
||||
className={cn(
|
||||
defaultTextProps,
|
||||
'flex h-10 max-h-10 w-full resize-none px-3 py-2 focus:outline-none focus:ring-0 focus:ring-opacity-0 focus:ring-offset-0'
|
||||
)}
|
||||
/>
|
||||
)}
|
||||
<small className="text-red-600">Your token will be sent to the server, but not saved.</small>
|
||||
{helpText?.[endpoint]}
|
||||
</div>
|
||||
}
|
||||
|
||||
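For context, the validator above only inspects three fields of the imported service-account key. A made-up example of the minimal shape it accepts (real keys contain more fields, and the private key must be longer than 600 characters):

```js
const exampleServiceAccountKey = {
  client_email: 'palm-sa@my-project.iam.gserviceaccount.com', // hypothetical
  project_id: 'my-project',                                   // hypothetical
  private_key: '-----BEGIN PRIVATE KEY-----\n...'             // truncated placeholder
};
```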
@@ -4,6 +4,7 @@ import SubmitButton from './SubmitButton';
|
||||
import OpenAIOptions from './OpenAIOptions';
|
||||
import ChatGPTOptions from './ChatGPTOptions';
|
||||
import BingAIOptions from './BingAIOptions';
|
||||
import GoogleOptions from './GoogleOptions';
|
||||
// import BingStyles from './BingStyles';
|
||||
import NewConversationMenu from './NewConversationMenu';
|
||||
import AdjustToneButton from './AdjustToneButton';
|
||||
@@ -139,6 +140,7 @@ export default function TextChat({ isSearchView = false }) {
|
||||
<span className="flex w-full flex-col items-center justify-center gap-0 md:order-none md:m-auto md:gap-2">
|
||||
<OpenAIOptions />
|
||||
<ChatGPTOptions />
|
||||
<GoogleOptions />
|
||||
<BingAIOptions show={showBingToneSetting} />
|
||||
</span>
|
||||
</div>
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
import React from 'react';
|
||||
import Clipboard from '../svg/Clipboard';
|
||||
import CheckMark from '../svg/CheckMark';
|
||||
import EditIcon from '../svg/EditIcon';
|
||||
import RegenerateIcon from '../svg/RegenerateIcon';
|
||||
|
||||
@@ -13,10 +14,11 @@ export default function HoverButtons({
|
||||
regenerate
|
||||
}) {
|
||||
const { endpoint, jailbreak = false } = conversation;
|
||||
const [isCopied, setIsCopied] = React.useState(false);
|
||||
|
||||
const branchingSupported =
|
||||
// azureOpenAI, openAI, chatGPTBrowser support branching, so edit enabled
|
||||
!!['azureOpenAI', 'openAI', 'chatGPTBrowser'].find(e => e === endpoint) ||
|
||||
!!['azureOpenAI', 'openAI', 'chatGPTBrowser', 'google'].find(e => e === endpoint) ||
|
||||
// Sydney in bingAI supports branching, so edit enabled
|
||||
(endpoint === 'bingAI' && jailbreak);
|
||||
|
||||
@@ -59,11 +61,11 @@ export default function HoverButtons({
|
||||
|
||||
<button
|
||||
className="hover-button rounded-md p-1 hover:bg-gray-100 hover:text-gray-700 dark:text-gray-400 dark:hover:bg-gray-700 dark:hover:text-gray-200 disabled:dark:hover:text-gray-400 md:invisible md:group-hover:visible"
|
||||
onClick={copyToClipboard}
|
||||
onClick={() => copyToClipboard(setIsCopied)}
|
||||
type="button"
|
||||
title="copy to clipboard"
|
||||
title={isCopied ? 'Copied to clipboard' : 'Copy to clipboard'}
|
||||
>
|
||||
<Clipboard />
|
||||
{isCopied ? <CheckMark /> : <Clipboard />}
|
||||
</button>
|
||||
</div>
|
||||
);
|
||||
|
||||
@@ -98,8 +98,13 @@ export default function Message({
|
||||
if (!isSubmitting && !message?.isCreatedByUser) regenerate(message);
|
||||
};
|
||||
|
||||
const copyToClipboard = () => {
|
||||
const copyToClipboard = (setIsCopied) => {
|
||||
setIsCopied(true);
|
||||
copy(message?.text);
|
||||
|
||||
setTimeout(() => {
|
||||
setIsCopied(false);
|
||||
}, 3000);
|
||||
};
|
||||
|
||||
const clickSearchResult = async () => {
|
||||
@@ -217,7 +222,7 @@ export default function Message({
|
||||
conversation={conversation}
|
||||
enterEdit={() => enterEdit()}
|
||||
regenerate={() => regenerateMessage()}
|
||||
copyToClipboard={() => copyToClipboard()}
|
||||
copyToClipboard={copyToClipboard}
|
||||
/>
|
||||
<SubRow subclasses="switch-container">
|
||||
<SiblingSwitch
|
||||
|
||||
@@ -31,6 +31,11 @@ const MessageHeader = ({ isSearchView = false }) => {
|
||||
const { chatGptLabel, model } = conversation;
|
||||
if (model) _title += `: ${model}`;
|
||||
if (chatGptLabel) _title += ` as ${chatGptLabel}`;
|
||||
} else if (endpoint === 'google') {
|
||||
_title = 'PaLM';
|
||||
const { modelLabel, model } = conversation;
|
||||
if (model) _title += `: ${model}`;
|
||||
if (modelLabel) _title += ` as ${modelLabel}`;
|
||||
} else if (endpoint === 'bingAI') {
|
||||
const { jailbreak, toneStyle } = conversation;
|
||||
if (toneStyle) _title += `: ${toneStyle}`;
|
||||
|
||||
@@ -26,8 +26,7 @@ export default function ClearConvos() {
|
||||
<Dialog>
|
||||
<DialogTrigger asChild>
|
||||
<button
|
||||
className="flex cursor-pointer items-center gap-3 rounded-md py-3 px-3 text-sm text-white transition-colors duration-200 hover:bg-gray-500/10"
|
||||
// onClick={clickHandler}
|
||||
className="flex w-full cursor-pointer items-center gap-3 px-3 py-3 text-sm text-white transition-colors duration-200 hover:bg-gray-700"
|
||||
>
|
||||
<TrashIcon />
|
||||
Clear conversations
|
||||
|
||||
@@ -11,7 +11,7 @@ export default function DarkMode() {
|
||||
|
||||
return (
|
||||
<button
|
||||
className="flex cursor-pointer items-center gap-3 rounded-md py-3 px-3 text-sm text-white transition-colors duration-200 hover:bg-gray-500/10"
|
||||
className="flex w-full cursor-pointer items-center gap-3 px-3 py-3 text-sm text-white transition-colors duration-200 hover:bg-gray-700"
|
||||
onClick={clickHandler}
|
||||
>
|
||||
{theme === 'dark' ? <LightModeIcon /> : <DarkModeIcon />}
|
||||
|
||||
@@ -25,7 +25,7 @@ export default function ExportConversation() {
|
||||
<>
|
||||
<button
|
||||
className={cn(
|
||||
'flex items-center gap-3 rounded-md py-3 px-3 text-sm transition-colors duration-200 hover:bg-gray-500/10',
|
||||
'flex py-3 px-3 items-center gap-3 transition-colors duration-200 text-white cursor-pointer text-sm hover:bg-gray-700 w-full',
|
||||
exportable ? 'cursor-pointer text-white' : 'cursor-not-allowed text-gray-400'
|
||||
)}
|
||||
onClick={clickHandler}
|
||||
|
||||
@@ -12,7 +12,7 @@ export default function Logout() {
|
||||
|
||||
return (
|
||||
<button
|
||||
className="flex cursor-pointer items-center gap-3 rounded-md py-3 px-3 text-sm text-white transition-colors duration-200 hover:bg-gray-500/10"
|
||||
className="flex py-3 px-3 items-center gap-3 transition-colors duration-200 text-white cursor-pointer text-sm hover:bg-gray-700 w-full"
|
||||
onClick={handleLogout}
|
||||
>
|
||||
<LogOutIcon />
|
||||
|
||||
@@ -1,21 +1,77 @@
|
||||
import { Menu, Transition } from '@headlessui/react';
|
||||
import { Fragment, useEffect, useRef, useState } from 'react';
|
||||
import SearchBar from './SearchBar';
|
||||
import ClearConvos from './ClearConvos';
|
||||
import DarkMode from './DarkMode';
|
||||
import Logout from './Logout';
|
||||
import ExportConversation from './ExportConversation';
|
||||
import { useAuthContext } from '~/hooks/AuthContext';
|
||||
import { cn } from '~/utils/';
|
||||
import DotsIcon from '../svg/DotsIcon';
|
||||
|
||||
export default function NavLinks({ clearSearch, isSearchEnabled }) {
|
||||
const { user, logout } = useAuthContext();
|
||||
return (
|
||||
<>
|
||||
{!!isSearchEnabled && (
|
||||
<SearchBar
|
||||
clearSearch={clearSearch}
|
||||
/>
|
||||
<Menu
|
||||
as="div"
|
||||
className="group relative"
|
||||
>
|
||||
{({ open }) => (
|
||||
<>
|
||||
<Menu.Button
|
||||
className={cn(
|
||||
'group-ui-open:bg-gray-800 flex w-full items-center gap-2.5 rounded-md px-3 py-3 text-sm transition-colors duration-200 hover:bg-gray-800',
|
||||
open ? 'bg-gray-800' : ''
|
||||
)}
|
||||
>
|
||||
<div className="-ml-0.5 h-5 w-5 flex-shrink-0">
|
||||
<div className="relative flex">
|
||||
<img
|
||||
className="rounded-sm"
|
||||
src={user?.avatar || `https://avatars.dicebear.com/api/initials/${user?.name}.svg`}
|
||||
alt=""
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
<div className="grow overflow-hidden text-ellipsis whitespace-nowrap text-left text-white">
|
||||
{user?.name || 'USER'}
|
||||
</div>
|
||||
<DotsIcon />
|
||||
</Menu.Button>
|
||||
|
||||
<Transition
|
||||
as={Fragment}
|
||||
enter="transition ease-out duration-100"
|
||||
enterFrom="transform opacity-0 scale-95"
|
||||
enterTo="transform opacity-100 scale-100"
|
||||
leave="transition ease-in duration-75"
|
||||
leaveFrom="transform opacity-100 scale-100"
|
||||
leaveTo="transform opacity-0 scale-95"
|
||||
>
|
||||
<Menu.Items className="absolute bottom-full left-0 z-20 mb-2 w-full translate-y-0 overflow-hidden rounded-md bg-[#050509] py-1.5 opacity-100 outline-none">
|
||||
<Menu.Item>
|
||||
{({}) => <>{!!isSearchEnabled && <SearchBar clearSearch={clearSearch} />}</>}
|
||||
</Menu.Item>
|
||||
<Menu.Item>{({}) => <ExportConversation />}</Menu.Item>
|
||||
|
||||
<div
|
||||
className="my-1.5 h-px bg-white/20"
|
||||
role="none"
|
||||
></div>
|
||||
<Menu.Item>{({}) => <DarkMode />}</Menu.Item>
|
||||
<Menu.Item>{({}) => <ClearConvos />}</Menu.Item>
|
||||
|
||||
<div
|
||||
className="my-1.5 h-px bg-white/20"
|
||||
role="none"
|
||||
></div>
|
||||
<Menu.Item>
|
||||
<Logout />
|
||||
</Menu.Item>
|
||||
</Menu.Items>
|
||||
</Transition>
|
||||
</>
|
||||
)}
|
||||
<ExportConversation />
|
||||
<DarkMode />
|
||||
<ClearConvos />
|
||||
<Logout />
|
||||
</>
|
||||
</Menu>
|
||||
);
|
||||
}
|
||||
|
||||
@@ -142,8 +142,8 @@ export default function Nav({ navVisible, setNavVisible }) {
|
||||
}
|
||||
>
|
||||
<div className="flex h-full min-h-0 flex-col ">
|
||||
<div className="scrollbar-trigger flex h-full w-full flex-1 items-start border-white/20">
|
||||
<nav className="flex h-full flex-1 flex-col space-y-1 p-2">
|
||||
<div className="scrollbar-trigger flex h-full w-full flex-1 items-start border-white/20 relative">
|
||||
<nav className="flex h-full flex-1 flex-col space-y-1 p-2 relative">
|
||||
<NewChat />
|
||||
<div
|
||||
className={`flex-1 flex-col overflow-y-auto ${
|
||||
|
||||
34
client/src/components/svg/DotsIcon.jsx
Normal file
@@ -0,0 +1,34 @@
import React from 'react';

export default function DotsIcon() {
  return (
    <svg
      stroke="currentColor"
      fill="none"
      strokeWidth="2"
      viewBox="0 0 24 24"
      strokeLinecap="round"
      strokeLinejoin="round"
      className="h-4 w-4 flex-shrink-0 text-gray-500"
      height="1em"
      width="1em"
      xmlns="http://www.w3.org/2000/svg"
    >
      <circle cx="12" cy="12" r="1" />
      <circle cx="19" cy="12" r="1" />
      <circle cx="5" cy="12" r="1" />
    </svg>
  );
}
21
client/src/components/svg/MessagesSquared.jsx
Normal file
@@ -0,0 +1,21 @@
import { cn } from '~/utils/';

export default function MessagesSquared({ className }) {
  return (
    <svg
      xmlns="http://www.w3.org/2000/svg"
      width="24"
      height="24"
      viewBox="0 0 24 24"
      fill="none"
      stroke="currentColor"
      strokeWidth="2"
      strokeLinecap="round"
      strokeLinejoin="round"
      className={cn(className, 'lucide lucide-messages-square')}
    >
      <path d="M14 9a2 2 0 0 1-2 2H6l-4 4V4c0-1.1.9-2 2-2h8a2 2 0 0 1 2 2v5Z" />
      <path d="M18 9h2a2 2 0 0 1 2 2v11l-4-4h-6a2 2 0 0 1-2-2v-1" />
    </svg>
  );
}
@@ -29,26 +29,6 @@ export default function DialogTemplate({
|
||||
<DialogTitle className="text-gray-800 dark:text-white">{title}</DialogTitle>
|
||||
<DialogDescription className="text-gray-600 dark:text-gray-300">{description}</DialogDescription>
|
||||
</DialogHeader>
|
||||
{/* <div className="grid gap-4 py-4">
|
||||
<div className="grid grid-cols-4 items-center gap-4"> //input template
|
||||
|
||||
</div>
|
||||
<div className="grid grid-cols-4 items-center gap-4">
|
||||
<Label
|
||||
htmlFor="promptPrefix"
|
||||
className="text-right"
|
||||
>
|
||||
Prompt Prefix
|
||||
</Label>
|
||||
<TextareaAutosize
|
||||
id="promptPrefix"
|
||||
value={promptPrefix}
|
||||
onChange={(e) => setPromptPrefix(e.target.value)}
|
||||
placeholder="Set custom instructions. Defaults to: 'You are ChatGPT, a large language model trained by OpenAI.'"
|
||||
className="col-span-3 flex h-20 w-full resize-none rounded-md border border-gray-300 bg-transparent py-2 px-3 text-sm shadow-[0_0_10px_rgba(0,0,0,0.10)] outline-none placeholder:text-gray-400 focus:outline-none focus:ring-gray-400 focus:ring-opacity-20 focus:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 dark:border-none dark:bg-gray-700 dark:text-gray-50 dark:shadow-[0_0_15px_rgba(0,0,0,0.10)] dark:focus:border-none dark:focus:border-transparent dark:focus:outline-none dark:focus:ring-0 dark:focus:ring-gray-400 dark:focus:ring-offset-0"
|
||||
/>
|
||||
</div>
|
||||
</div> */}
|
||||
{main ? main : null}
|
||||
<DialogFooter>
|
||||
<div>{leftButtons ? leftButtons : null}</div>
|
||||
|
||||
@@ -8,6 +8,7 @@ export default function createPayload(submission: TSubmission) {
|
||||
const endpointUrlMap = {
|
||||
azureOpenAI: '/api/ask/azureOpenAI',
|
||||
openAI: '/api/ask/openAI',
|
||||
google: '/api/ask/google',
|
||||
bingAI: '/api/ask/bingAI',
|
||||
chatGPTBrowser: '/api/ask/chatGPTBrowser'
|
||||
};
|
||||
|
||||
@@ -11,6 +11,11 @@ export type TMessage = {
|
||||
updatedAt: string,
|
||||
};
|
||||
|
||||
export type TExample = {
|
||||
input: string,
|
||||
output: string,
|
||||
};
|
||||
|
||||
export type TSubmission = {
|
||||
clientId?: string;
|
||||
context?: string;
|
||||
@@ -43,7 +48,8 @@ export enum EModelEndpoint {
|
||||
openAI = 'openAI',
|
||||
bingAI = 'bingAI',
|
||||
chatGPT = 'chatGPT',
|
||||
chatGPTBrowser = 'chatGPTBrowser'
|
||||
chatGPTBrowser = 'chatGPTBrowser',
|
||||
google = 'google',
|
||||
}
|
||||
|
||||
export type TConversation = {
|
||||
@@ -55,11 +61,19 @@ export type TConversation = {
|
||||
messages?: TMessage[];
|
||||
createdAt: string;
|
||||
updatedAt: string;
|
||||
// google only
|
||||
modelLabel?: string;
|
||||
examples?: TExample[];
|
||||
// for azureOpenAI, openAI only
|
||||
chatGptLabel?: string;
|
||||
userLabel?: string;
|
||||
model?: string;
|
||||
promptPrefix?: string;
|
||||
temperature?: number;
|
||||
topP?: number;
|
||||
topK?: number;
|
||||
// bing and google
|
||||
context?: string;
|
||||
top_p?: number;
|
||||
presence_penalty?: number;
|
||||
// for bingAI only
|
||||
|
||||
@@ -6,7 +6,8 @@ const endpointsConfig = atom({
|
||||
azureOpenAI: null,
|
||||
openAI: null,
|
||||
bingAI: null,
|
||||
chatGPTBrowser: null
|
||||
chatGPTBrowser: null,
|
||||
google: null,
|
||||
}
|
||||
});
|
||||
|
||||
@@ -24,7 +25,7 @@ const endpointsFilter = selector({
|
||||
const availableEndpoints = selector({
|
||||
key: 'availableEndpoints',
|
||||
get: ({ get }) => {
|
||||
const endpoints = ['azureOpenAI', 'openAI', 'bingAI', 'chatGPTBrowser'];
|
||||
const endpoints = ['azureOpenAI', 'openAI', 'chatGPTBrowser', 'bingAI', 'google'];
|
||||
const f = get(endpointsFilter);
|
||||
return endpoints.filter(endpoint => f[endpoint]);
|
||||
}
|
||||
|
||||
@@ -2,6 +2,110 @@
|
||||
@tailwind components;
|
||||
@tailwind utilities;
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Signifier;
|
||||
font-style: normal;
|
||||
font-weight: 400;
|
||||
src: url("../fonts/signifier-light.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Signifier;
|
||||
font-style: italic;
|
||||
font-weight: 400;
|
||||
src: url("../fonts/signifier-light-italic.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Signifier;
|
||||
font-style: normal;
|
||||
font-weight: 700;
|
||||
src: url("../fonts/signifier-bold.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Signifier;
|
||||
font-style: italic;
|
||||
font-weight: 700;
|
||||
src: url("../fonts/signifier-bold-italic.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Söhne;
|
||||
font-style: normal;
|
||||
font-weight: 400;
|
||||
src: url("../fonts/soehne-buch.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Söhne;
|
||||
font-style: italic;
|
||||
font-weight: 400;
|
||||
src: url("../fonts/soehne-buch-kursiv.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Söhne;
|
||||
font-style: normal;
|
||||
font-weight: 500;
|
||||
src: url("../fonts/soehne-kraftig.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Söhne;
|
||||
font-style: italic;
|
||||
font-weight: 500;
|
||||
src: url("../fonts/soehne-kraftig-kursiv.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Söhne;
|
||||
font-style: normal;
|
||||
font-weight: 600;
|
||||
src: url("../fonts/soehne-halbfett.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Söhne;
|
||||
font-style: italic;
|
||||
font-weight: 600;
|
||||
src: url("../fonts/soehne-halbfett-kursiv.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Söhne Mono;
|
||||
font-style: normal;
|
||||
font-weight: 400;
|
||||
src: url("../fonts/soehne-mono-buch.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Söhne Mono;
|
||||
font-style: normal;
|
||||
font-weight: 700;
|
||||
src: url("../fonts/soehne-mono-halbfett.woff2") format("woff2")
|
||||
}
|
||||
|
||||
@font-face {
|
||||
font-display: swap;
|
||||
font-family: Söhne Mono;
|
||||
font-style: italic;
|
||||
font-weight: 400;
|
||||
src: url("../fonts/soehne-mono-buch-kursiv.woff2") format("woff2")
|
||||
}
|
||||
|
||||
/* * {
|
||||
box-sizing: border-box;
|
||||
outline: 1px solid limegreen !important;
|
||||
|
||||
@@ -15,6 +15,20 @@ const cleanupPreset = ({ preset: _preset, endpointsConfig = {} }) => {
|
||||
frequency_penalty: _preset?.frequency_penalty ?? 0,
|
||||
title: _preset?.title ?? 'New Preset'
|
||||
};
|
||||
} else if (endpoint === 'google') {
|
||||
preset = {
|
||||
endpoint,
|
||||
presetId: _preset?.presetId ?? null,
|
||||
model: _preset?.model ?? endpointsConfig[endpoint]?.availableModels?.[0] ?? 'chat-bison',
|
||||
modelLabel: _preset?.modelLabel ?? null,
|
||||
examples: _preset?.examples ?? [{ input: { content: '' }, output: { content: '' } }],
|
||||
promptPrefix: _preset?.promptPrefix ?? null,
|
||||
temperature: _preset?.temperature ?? 0.2,
|
||||
maxOutputTokens: _preset?.maxOutputTokens ?? 1024,
|
||||
topP: _preset?.topP ?? 0.95,
|
||||
topK: _preset?.topK ?? 40,
|
||||
title: _preset?.title ?? 'New Preset'
|
||||
};
|
||||
} else if (endpoint === 'bingAI') {
|
||||
preset = {
|
||||
endpoint,
|
||||
|
||||
@@ -22,6 +22,23 @@ const buildDefaultConversation = ({
|
||||
presence_penalty: lastConversationSetup?.presence_penalty ?? 0,
|
||||
frequency_penalty: lastConversationSetup?.frequency_penalty ?? 0
|
||||
};
|
||||
} else if (endpoint === 'google') {
|
||||
conversation = {
|
||||
...conversation,
|
||||
endpoint,
|
||||
model:
|
||||
lastConversationSetup?.model ??
|
||||
lastSelectedModel[endpoint] ??
|
||||
endpointsConfig[endpoint]?.availableModels?.[0] ??
|
||||
'chat-bison',
|
||||
modelLabel: lastConversationSetup?.modelLabel ?? null,
|
||||
promptPrefix: lastConversationSetup?.promptPrefix ?? null,
|
||||
examples: lastConversationSetup?.examples ?? [{ input: { content: '' }, output: { content: '' }}],
|
||||
temperature: lastConversationSetup?.temperature ?? 0.2,
|
||||
maxOutputTokens: lastConversationSetup?.maxOutputTokens ?? 1024,
|
||||
topP: lastConversationSetup?.topP ?? 0.95,
|
||||
topK: lastConversationSetup?.topK ?? 40,
|
||||
};
|
||||
} else if (endpoint === 'bingAI') {
|
||||
conversation = {
|
||||
...conversation,
|
||||
@@ -108,7 +125,7 @@ const getDefaultConversation = ({ conversation, prevConversation, endpointsConfi
|
||||
|
||||
// if anything happens, reset to default model
|
||||
|
||||
const endpoint = ['openAI', 'azureOpenAI', 'bingAI', 'chatGPTBrowser'].find(e => endpointsConfig?.[e]);
|
||||
const endpoint = ['openAI', 'azureOpenAI', 'bingAI', 'chatGPTBrowser', 'google'].find(e => endpointsConfig?.[e]);
|
||||
if (endpoint) {
|
||||
conversation = buildDefaultConversation({ conversation, endpoint, endpointsConfig });
|
||||
return conversation;
|
||||
|
||||
@@ -3,25 +3,28 @@ import React from 'react';
|
||||
import { twMerge } from 'tailwind-merge';
|
||||
import GPTIcon from '../components/svg/GPTIcon';
|
||||
import BingIcon from '../components/svg/BingIcon';
|
||||
import { useAuthContext } from '~/hooks/AuthContext';
|
||||
|
||||
const getIcon = props => {
|
||||
// { size = 30, isCreatedByUser, model, chatGptLabel, error, ...props }
|
||||
const { size = 30, isCreatedByUser, button, model } = props;
|
||||
const { user, logout } = useAuthContext();
|
||||
|
||||
if (isCreatedByUser)
|
||||
return (
|
||||
<div
|
||||
title="User"
|
||||
title={user.name}
|
||||
style={{
|
||||
background: 'radial-gradient(circle at 90% 110%, rgb(1 43 128), rgb(17, 139, 161))',
|
||||
color: 'white',
|
||||
fontSize: 12,
|
||||
width: size,
|
||||
height: size
|
||||
}}
|
||||
className={`relative flex items-center justify-center rounded-sm text-white ` + props?.className}
|
||||
className={`relative flex items-center justify-center` + props?.className}
|
||||
>
|
||||
User
|
||||
<img
|
||||
className="rounded-sm"
|
||||
src={user?.avatar || `https://api.dicebear.com/6.x/initials/svg?seed=${user?.name}&fontFamily=Verdana&fontSize=36`}
|
||||
alt="avatar"
|
||||
/>
|
||||
</div>
|
||||
);
|
||||
else if (!isCreatedByUser) {
|
||||
@@ -30,27 +33,28 @@ const getIcon = props => {
|
||||
let icon, bg, name;
|
||||
if (endpoint === 'azureOpenAI') {
|
||||
const { chatGptLabel } = props;
|
||||
|
||||
icon = <GPTIcon size={size * 0.7} />;
|
||||
bg = 'linear-gradient(0.375turn, #61bde2, #4389d0)';
|
||||
name = chatGptLabel || 'ChatGPT';
|
||||
} else if (endpoint === 'openAI') {
|
||||
const { chatGptLabel } = props;
|
||||
|
||||
icon = <GPTIcon size={size * 0.7} />;
|
||||
bg = model && model.toLowerCase() === 'gpt-4' ? 'black' : (chatGptLabel
|
||||
bg = model && model.toLowerCase().startsWith('gpt-4') ? '#AB68FF' : (chatGptLabel
|
||||
? `rgba(16, 163, 127, ${button ? 0.75 : 1})`
|
||||
: `rgba(16, 163, 127, ${button ? 0.75 : 1})`);
|
||||
name = chatGptLabel || 'ChatGPT';
|
||||
} else if (endpoint === 'google') {
|
||||
const { modelLabel } = props;
|
||||
icon = <img src='/assets/palm.png' />;
|
||||
name = modelLabel || 'PaLM2';
|
||||
} else if (endpoint === 'bingAI') {
|
||||
const { jailbreak } = props;
|
||||
|
||||
icon = <BingIcon size={size * 0.7} />;
|
||||
bg = jailbreak ? `radial-gradient(circle at 90% 110%, #F0F0FA, #D0E0F9)` : `transparent`;
|
||||
name = jailbreak ? 'Sydney' : 'BingAI';
|
||||
} else if (endpoint === 'chatGPTBrowser') {
|
||||
icon = <GPTIcon size={size * 0.7} />;
|
||||
bg = model && model.toLowerCase() === 'gpt-4' ? 'black' : `rgba(0, 163, 255, ${button ? 0.75 : 1})`;
|
||||
bg = model && model.toLowerCase().startsWith('gpt-4') ? '#AB68FF' : `rgba(0, 163, 255, ${button ? 0.75 : 1})`;
|
||||
name = 'ChatGPT';
|
||||
} else if (endpoint === null) {
|
||||
icon = <GPTIcon size={size * 0.7} />;
|
||||
|
||||
@@ -40,6 +40,21 @@ const useMessageHandler = () => {
|
||||
frequency_penalty: currentConversation?.frequency_penalty ?? 0
|
||||
};
|
||||
responseSender = endpointOption.chatGptLabel ?? 'ChatGPT';
|
||||
} else if (endpoint === 'google') {
|
||||
endpointOption = {
|
||||
endpoint,
|
||||
model:
|
||||
currentConversation?.model ?? endpointsConfig[endpoint]?.availableModels?.[0] ?? 'chat-bison',
|
||||
chatGptLabel: currentConversation?.chatGptLabel ?? null,
|
||||
promptPrefix: currentConversation?.promptPrefix ?? null,
|
||||
examples: currentConversation?.examples ?? [{ input: { content: '' }, output: { content: '' }}],
|
||||
temperature: currentConversation?.temperature ?? 0.2,
|
||||
maxOutputTokens: currentConversation?.maxOutputTokens ?? 1024,
|
||||
topP: currentConversation?.topP ?? 0.95,
|
||||
topK: currentConversation?.topK ?? 40,
|
||||
token: endpointsConfig[endpoint]?.userProvide ? getToken() : null
|
||||
};
|
||||
responseSender = endpointOption.chatGptLabel ?? 'ChatGPT';
|
||||
} else if (endpoint === 'bingAI') {
|
||||
endpointOption = {
|
||||
endpoint,
|
||||
@@ -125,7 +140,7 @@ const useMessageHandler = () => {
|
||||
initialResponse
|
||||
};
|
||||
|
||||
console.log('User Input:', text);
|
||||
console.log('User Input:', text, submission);
|
||||
|
||||
if (isRegenerate) {
|
||||
setMessages([...currentMessages, initialResponse]);
|
||||
|
||||
@@ -9,10 +9,11 @@ module.exports = {
|
||||
// colors: {
|
||||
// 'gpt-dark-gray': '#343541',
|
||||
// },
|
||||
fontFamily: {
|
||||
sans: ['Söhne', 'sans-serif'],
|
||||
mono: ['Söhne Mono', 'monospace'],
|
||||
},
|
||||
extend: {
|
||||
// fontFamily: {
|
||||
// sans: ['var(--font-sans)', ...fontFamily.sans]
|
||||
// },
|
||||
keyframes: {
|
||||
'accordion-down': {
|
||||
from: { height: 0 },
|
||||
@@ -52,7 +53,7 @@ module.exports = {
|
||||
800: "#06373e",
|
||||
900: "#031f29",
|
||||
},
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
plugins: [
|
||||
|
||||
104
documents/contributions/coding_conventions.md
Normal file
@@ -0,0 +1,104 @@
# Coding Conventions

## Node.js API Server

### 1. General Guidelines

- Follow the [Airbnb JavaScript Style Guide](https://github.com/airbnb/javascript) for general JavaScript coding conventions.
- Use "clean code" principles, such as keeping functions and modules small, adhering to the single responsibility principle, and writing expressive and readable code.
- Use meaningful and descriptive variable and function names.
- Prioritize code readability and maintainability over brevity.
- Use the provided .eslintrc and .prettierrc files for consistent code formatting.
- Use CommonJS modules (require/exports) for Node.js modules.
- Organize and modularize the codebase using separate files for different concerns.

### 2. API Design

- Follow RESTful principles when designing APIs.
- Use meaningful and descriptive names for routes, controllers, services, and models.
- Use appropriate HTTP methods (GET, POST, PUT, DELETE) for each route.
- Use proper status codes and response structures for consistent API responses (i.e., 2xx for success, 4xx for a bad client request, 5xx for a server error, etc.).
- Use try-catch blocks to catch and handle exceptions gracefully.
- Implement proper error handling and consistently return appropriate error responses.
- Use the logging system included in the `utils` directory to log important events and errors.
- Use JWT-based, stateless authentication via the `requireJWTAuth` middleware.

### 3. File Structure

*Note: The API is undergoing a refactor to separate out the code for improved separation of concerns, testability, and maintainability. Any new APIs must follow this structure, using the auth system as an example, which separates routes, controllers, services, and models into their own files.*

#### Routes

Specifies each HTTP request method, any middleware to be used, and the controller function to be called for each route (see the sketch after this list).

- Define routes using the Express Router in separate files for each resource or logical grouping.
- Use descriptive route names and adhere to RESTful conventions.
- Keep routes concise and focused on a single responsibility.
- Prefix all routes with the /api namespace.
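As a minimal sketch of this structure (not taken from the repo), a resource route file might look like the following. The file path and the import paths for `requireJWTAuth` and `UserController` are assumptions; only the Express Router usage, the middleware name, and the `/api` prefix come from the conventions above.

```
// routes/users.js : hypothetical resource route file, mounted under /api in the app entry point
const express = require('express');
const requireJWTAuth = require('../middleware/requireJWTAuth'); // import path is an assumption
const UserController = require('../controllers/UserController'); // import path is an assumption

const router = express.Router();

// RESTful routes for the "users" resource; each route delegates to a controller method
router.get('/', requireJWTAuth, UserController.list);
router.get('/:id', requireJWTAuth, UserController.getById);
router.post('/', requireJWTAuth, UserController.create);
router.delete('/:id', requireJWTAuth, UserController.remove);

module.exports = router;
```

In the app entry point this would be mounted with something like `app.use('/api/users', router)` so every route stays under the `/api` namespace.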
#### Controllers

Contains the logic for each route, including calling the appropriate service functions and returning the appropriate response status code and JSON body (see the sketch after this list).

- Create a separate controller file for each route to handle the request/response logic.
- Name controller files using the PascalCase convention and append "Controller" to the file name (e.g., UserController.js).
- Use controller methods to encapsulate logic related to the route handling.
- Keep controllers thin by delegating complex operations to service or model files.
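A matching controller sketch under the same assumptions: `UserService` and the logger import path/shape are placeholders for illustration, and the status codes simply follow the 2xx/4xx/5xx guidance above.

```
// controllers/UserController.js : thin request/response layer; heavy lifting is delegated to a service
// NOTE: the UserService module and the logger export shape are assumptions, not the repo's actual code.
const UserService = require('../services/UserService');
const { logger } = require('../utils');

async function getById(req, res) {
  try {
    const user = await UserService.findById(req.params.id);
    if (!user) {
      return res.status(404).json({ message: 'User not found' });
    }
    return res.status(200).json(user);
  } catch (err) {
    logger.error('UserController.getById failed', err);
    return res.status(500).json({ message: 'Internal server error' });
  }
}

module.exports = { getById };
```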
#### Services

Contains complex business logic or operations shared across multiple controllers.

- Name service files using the PascalCase convention and append "Service" to the file name (e.g., AuthService.js).
- Avoid tightly coupling services to specific models or databases for better reusability.
- Maintain a single responsibility principle within each service.
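A minimal service sketch following the list above; the model import path is hypothetical.

```
// services/UserService.js : reusable business logic, decoupled from req/res objects
const User = require('../models/User'); // hypothetical model path

async function findById(id) {
  // Return a plain object so callers are not tied to Mongoose document internals
  return User.findById(id).lean();
}

module.exports = { findById };
```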
#### Models

Defines Mongoose models to represent data entities and their relationships.

- Use singular, PascalCase names for model files and their associated collections (e.g., User.js and users collection).
- Include only the necessary fields, indexes, and validations in the models.
- Keep models independent of the API layer by avoiding direct references to request/response objects.
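A sketch of a model file following the naming convention above (User.js maps to the users collection); the specific fields are illustrative, not the project's actual schema.

```
// models/User.js : Mongoose model; the fields shown here are illustrative only
const mongoose = require('mongoose');

const userSchema = new mongoose.Schema(
  {
    email: { type: String, required: true, unique: true, lowercase: true },
    name: { type: String, required: true, trim: true },
  },
  { timestamps: true }
);

// Mongoose pluralizes 'User' to the 'users' collection by default
module.exports = mongoose.model('User', userSchema);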
### 4. Database Access (MongoDB and Mongoose)

- Use Mongoose (https://mongoosejs.com) as the MongoDB ODM.
- Create separate model files for each entity and ensure clear separation of concerns.
- Use Mongoose schema validation to enforce data integrity.
- Handle database connections efficiently and avoid connection leaks.
- Use Mongoose query builders to create concise and readable database queries.
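For example, query builders keep read paths concise and readable; the model import and filter fields below are illustrative only.

```
// Illustrative query-builder usage: filter, project, sort, and limit in one readable chain
const User = require('../models/User'); // hypothetical model path

async function findRecentAdmins() {
  return User.find({ role: 'admin' })   // the role field is illustrative
    .select('email name createdAt')
    .sort({ createdAt: -1 })
    .limit(20)
    .lean();
}

module.exports = { findRecentAdmins };
```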
### 5. Testing and Documentation

*Note: the repo currently lacks sufficient automated unit and integration tests for both the client and the API. This is a great first issue for new contributors wanting to familiarize themselves with the codebase.*

- Write unit tests for all critical and complex functionality using Jest.
- Write integration tests for all API endpoints using Supertest (see the sketch after this list).
- Write end-to-end tests for all client-side functionality using Playwright.
- Use descriptive test case and function names to clearly express the test's purpose.
- Document the code using JSDoc comments to provide clear explanations of functions, parameters, and return types. (WIP)
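A sketch of what a Jest + Supertest integration test might look like; the app import path and the `/api/auth/login` route are assumptions, not the repo's actual layout.

```
// __tests__/auth.integration.test.js : illustrative Jest + Supertest integration test
const request = require('supertest');
const app = require('../server/index'); // hypothetical path to the exported Express app

describe('POST /api/auth/login', () => {
  it('rejects a request with missing credentials', async () => {
    const res = await request(app).post('/api/auth/login').send({});
    expect(res.status).toBe(400); // 4xx for a bad client request, per the API design section
  });
});
```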
## React Client

### General TypeScript and React Best Practices

- Use [TypeScript best practices](https://onesignal.com/blog/effective-typescript-for-react-applications/) to benefit from static typing and improved tooling.
- Group related files together within folders.
- Name components using the PascalCase convention.
- Use concise and descriptive names that accurately reflect the component's purpose.
- Split complex components into smaller, reusable ones when appropriate.
- Keep the rendering logic within components minimal.
- Extract reusable parts into separate functions or hooks (see the sketch after this list).
- Apply prop type definitions using TypeScript types or interfaces.
- Use form validation where appropriate. (Note: we use [React Hook Form](https://react-hook-form.com/) for form validation and submission.)
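To make this concrete, here is a small JavaScript/JSX sketch (in TypeScript the props would instead be a typed interface); the component and helper names are made up for illustration.

```
// ConversationTitle.jsx : hypothetical example of a small, focused component with an extracted helper
import React from 'react';

// Reusable formatting logic kept out of the render body
function formatTitle(title, maxLength = 32) {
  if (!title) return 'New Chat';
  return title.length > maxLength ? `${title.slice(0, maxLength)}…` : title;
}

export default function ConversationTitle({ title, onClick }) {
  return (
    <button type="button" onClick={onClick} className="truncate text-sm">
      {formatTitle(title)}
    </button>
  );
}
```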
### Data Services

Use the conventions found in the `data-provider` directory for handling data services. For more information, see [this article](https://www.danorlandoblog.com/chatgpt-clone-data-services-with-react-query/), which describes the methodology used. A minimal sketch of the pattern follows.
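This sketch assumes a React Query v3-style API and an axios client; the endpoint and query key are illustrative, not the actual data-provider code.

```
// data-provider/queries.js : illustrative React Query wrapper around an API call
import axios from 'axios';
import { useQuery } from 'react-query';

const getConversations = async (pageNumber = 1) => {
  const { data } = await axios.get('/api/convos', { params: { pageNumber } }); // endpoint is an assumption
  return data;
};

// Components call this hook instead of touching axios directly
export const useGetConversations = (pageNumber) =>
  useQuery(['conversations', pageNumber], () => getConversations(pageNumber), {
    keepPreviousData: true,
  });
```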
### State Management

Use [Recoil](https://recoiljs.org/) for state management, but *DO NOT pollute the global state with unnecessary data*. Instead, use local state or props for data that is only used within a component or passed down from parent to child, as in the sketch below.
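A sketch of the intended split: a Recoil atom only for state that several components actually share, and local `useState` for anything component-scoped. The atom key and component are illustrative.

```
// Illustrative only: shared text lives in a Recoil atom, the focus flag stays local
import React, { useState } from 'react';
import { atom, useRecoilState } from 'recoil';

export const textState = atom({
  key: 'textState', // globally unique key
  default: '',
});

export function MessageInput() {
  const [text, setText] = useRecoilState(textState); // shared across components
  const [isFocused, setIsFocused] = useState(false); // only used here, so kept local

  return (
    <textarea
      value={text}
      onChange={(e) => setText(e.target.value)}
      onFocus={() => setIsFocused(true)}
      onBlur={() => setIsFocused(false)}
      className={isFocused ? 'ring-2' : ''}
    />
  );
}
```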
@@ -5,11 +5,19 @@ such as bug reports, documentation improvements, feature requests, and code cont
|
||||
|
||||
## Contributing Guidelines
|
||||
|
||||
When contributing to this repository, please first discuss the change you wish to make via [issue](https://github.com/danny-avila/chatgpt-clone/issues) or
|
||||
join our [Discord community](https://discord.gg/NGaa9RPCft).
|
||||
If the feature you would like to contribute has not already received prior approval from the project maintainers (i.e., the feature is currently on the roadmap or on the [trello board]()), please submit a proposal in the [proposals category](https://github.com/danny-avila/chatgpt-clone/discussions/categories/proposals) of the discussions board before beginning work on it.
|
||||
- Proposals should include specific implementation details, including the areas of the application that will be affected by the change, designs if applicable, and any other relevant information that might be required for a speedy review.
|
||||
- Proposals are not required for small changes, bug fixes, or documentation improvements.
|
||||
- Small changes and bug fixes should be tied to an [issue](https://github.com/danny-avila/chatgpt-clone/issues) and included in the corresponding pull request for tracking purposes.
|
||||
|
||||
*Please note that a pull request involving a feature that has not been reviewed and approved by the project maintainers may be rejected.*
|
||||
|
||||
If you would like to discuss the changes you wish to make, join our [Discord community](https://discord.gg/NGaa9RPCft).
|
||||
|
||||
## Our Standards
|
||||
|
||||
Please read our [Coding Standards and Conventions](coding_conventions.md) before beginning a contribution.
|
||||
|
||||
Examples of behavior that contributes to creating a positive environment
|
||||
include:
|
||||
|
||||
|
||||
36
documents/general_info/multilingual_information.md
Normal file
@@ -0,0 +1,36 @@
|
||||
# Multilingual Information
|
||||
To set up the project, please follow the instructions in the documentation. The documentation is in English only, so you may need to use a translation tool or an AI assistant (e.g. ChatGPT) if you have difficulty understanding it.
|
||||
#
|
||||
Para configurar el proyecto, por favor siga las instrucciones en la documentación. La documentación está en inglés solamente, así que quizá necesite utilizar una herramienta de traducción o un asistente de inteligencia artificial (por ejemplo, ChatGPT) si tiene dificultades para entenderla.
|
||||
#
|
||||
要设置该项目,请按照文档中的说明进行操作。文档仅以英语为语言,如果您有困难理解,请使用翻译工具或人工智能助手(例如 ChatGPT)。
|
||||
#
|
||||
परियोजना सेटअप करने के लिए, कृपया दस्तावेज़ीकरण में दिए गए निर्देशों का पालन करें। दस्तावेज़ीकरण केवल अंग्रेज़ी में है, इसलिए आपको इसे समझने में कठिनाई होती हो तो आप अनुवाद उपकरण या एक एआई सहायक (जैसे कि ChatGPT) का उपयोग कर सकते हैं।
|
||||
#
|
||||
لإعداد المشروع، يرجى اتباع التعليمات الموجودة في الوثائق. الوثائق باللغة الإنجليزية فقط، لذلك قد تحتاج إلى استخدام أداة ترجمة أو مساعدة الذكاء الاصطناعي (على سبيل المثال، ChatGPT) إذا كنت معنويًا صعوبة في فهمها.
|
||||
#
|
||||
Para configurar o projeto, siga as instruções na documentação. Esta documentação está disponível apenas em inglês, portanto, se tiver dificuldades em compreendê-la, pode ser necessário usar uma ferramenta de tradução ou um assistente de inteligência artificial (como o ChatGPT).
|
||||
#
|
||||
Для настройки проекта, пожалуйста, следуйте инструкциям, приведенным в документации. Документация доступна только на английском языке, поэтому, если у вас возникнут затруднения в понимании, вам может потребоваться использовать инструмент перевода или искусственный интеллект (например, ChatGPT).
|
||||
#
|
||||
設置專案,請跟隨文件中的說明進行。文件只提供英文,因此如果您對理解有困難,可能需要使用翻譯工具或 AI 助理 (例如 ChatGPT)。
|
||||
#
|
||||
Pour installer projet, veuillez suivre les instructions de la documentation. La documentation est disponible uniquement en anglais, donc si vous avez des difficultés à la comprendre, il peut être nécessaire d’utiliser un outil de traduction ou un assistant d’intelligence artificielle (comme ChatGPT).
|
||||
#
|
||||
Um das Projekt einzurichten, befolgen Sie bitte die Anweisungen in der Dokumentation. Die Dokumentation ist nur auf Englisch verfügbar, so dass es bei Schwierigkeiten beim Verständnis möglicherweise notwendig ist, eine Übersetzungshilfe oder einen AI-Assistenten (wie ChatGPT) zu verwenden.
|
||||
#
|
||||
プロジェクトをセットアップするには、ドキュメンテーションに記載された手順に従ってください。ドキュメンテーションは現在英語のみとなっている為、理解が難しい場合は翻訳ツールやAIアシスタント(ChatGPTなど)の翻訳機能の利用をお勧めします。
|
||||
#
|
||||
프로젝트를 셋업하려면 문서에 기재된 지시사항을 따라 진행해주세요. 현재 문서는 영어로만 제공되므로 이해하는 데 어려움이 있다면 번역 도구 또는 AI 어시스턴트(예: ChatGPT)를 사용하는것을 권장합니다.
|
||||
#
|
||||
Per impostare il progetto, seguire le istruzioni presenti nella documentazione. La documentazione è disponibile solo in inglese, quindi, se avete difficoltà a comprenderla, può essere necessario utilizzare uno strumento di traduzione o un assistente AI (ad esempio, ChatGPT).
|
||||
#
|
||||
Om het project op te zetten, volg de instructies in de documentatie. De documentatie is alleen beschikbaar in het Engels, dus als u moeite hebt om deze te begrijpen, kan het nodig zijn om een vertaalmiddel of een AI-assistent (zoals ChatGPT) te gebruiken.
|
||||
#
|
||||
A projekt beállításához kövesse a használati útmutatót. Az útmutató csak angolul érhető el, így ha nehézséget okoz a megértése, szükség lehet fordító eszközre vagy AI-asszisztensre (pl. ChatGPT).
|
||||
#
|
||||
Aby skonfigurować projekt, należy postępować zgodnie z instrukcjami zawartymi w dokumentacji. Dokumentacja jest dostępna tylko w języku angielskim, więc w razie trudności w zrozumieniu, może być konieczne użycie narzędzia do tłumaczenia lub asystenta AI (np. ChatGPT).
|
||||
##
|
||||
|
||||
## [Go Back to ReadMe](../../README.md)
|
||||
|
||||
@@ -1,4 +1,6 @@
|
||||
# Linux Installation
|
||||
Thanks to @DavidDev1334!
|
||||
##
|
||||
|
||||
## Prerequisites
|
||||
|
||||
@@ -86,15 +88,15 @@ You will need all your credentials, (API keys, access tokens, and MongoDB Connec
|
||||
|
||||
## Run the project
|
||||
|
||||
### Using the command line
|
||||
### Using the command line (in the root directory)
|
||||
Setup the app:
|
||||
1. Run `npm ci`
|
||||
2. Run `npm run frontend`
|
||||
|
||||
1. Run `npm ci` in the "/home/user/chatgpt-clone/api" directory
|
||||
2. Run `npm ci` in the "/home/user/chatgpt-clone/client" directory
|
||||
3. Run `npm run build` in the "/home/user/chatgpt-clone/client"
|
||||
4. Run `meilisearch --master-key put_your_meilesearch_Master_Key_here` in the "/home/user/chat
|
||||
5. Run "meilisearch --master-key put_your_meilesearch_Master_Key_here" in the "/home/user/chatgpt-clone" directory (Only if SEARCH=TRUE)
|
||||
6. Run npm start in the "/home/user/chatgpt-clone/api" directory
|
||||
7. Visit http://localhost:3080 (default port) & enjoy
|
||||
Start the app:
|
||||
1. Run `npm run backend`
|
||||
2. Run `meilisearch --master-key put_your_meilesearch_Master_Key_here` (Only if SEARCH=TRUE)
|
||||
3. Visit http://localhost:3080 (default port) & enjoy
|
||||
|
||||
### Using a shell script
|
||||
|
||||
@@ -112,13 +114,13 @@ You will need all your credentials, (API keys, access tokens, and MongoDB Connec
|
||||
gnome-terminal --tab --title="MeiliSearch" --command="bash -c 'meilisearch --master-key your_master_key_goes_here'"
|
||||
# ↑↑↑ meilisearch is the name of the meilisearch executable, put your own master key there
|
||||
|
||||
gnome-terminal --tab --title="ChatGPT-Clone" --working-directory=/home/user/chatgpt-clone/api --command="bash -c 'npm start'"
|
||||
gnome-terminal --tab --title="ChatGPT-Clone" --working-directory=/home/user/chatgpt-clone/ --command="bash -c 'npm run backend'"
|
||||
# this shell script goes at the root of the chatgpt-clone directory (/home/user/chatgpt-clone/)
|
||||
```
|
||||
|
||||
## Update the app version
|
||||
|
||||
If you update the chatgpt-clone project files, manually redo the npm ci and npm run build steps.
|
||||
If you update the chatgpt-clone project files, manually redo the npm ci and npm run frontend steps.
|
||||
|
||||
##
|
||||
|
||||
|
||||
@@ -58,14 +58,8 @@ Follow the instructions for setting up proxies, access tokens, and user system:
|
||||
|
||||
|
||||
- Create a .env file in the api directory by running cp api/.env.example api/.env and edit the file with your preferred text editor, adding the required API keys, access tokens, and MongoDB connection string
|
||||
- Run npm ci in both the api and client directories by running:
|
||||
|
||||
```
|
||||
cd api && npm ci && cd ..
|
||||
cd client && npm ci && cd ..
|
||||
```
|
||||
|
||||
- Build the client by running cd client && npm run build && cd ..
|
||||
- Run npm ci root directory `npm ci`
|
||||
- Build the client by running `npm run frontend`
|
||||
|
||||
**Download MeiliSearch for macOS:**
|
||||
- You can download the latest MeiliSearch binary for macOS from their GitHub releases page: https://github.com/meilisearch/MeiliSearch/releases. Look for the file named meilisearch-macos-amd64 (or the equivalent for your system architecture) and download it.
|
||||
@@ -108,7 +102,7 @@ Visit http://localhost:3080 (default port) & enjoy
|
||||
if [ -x "$(command -v ./meilisearch)" ]; then
|
||||
./meilisearch --master-key your_master_key_goes_here &
|
||||
fi
|
||||
cd api && npm start
|
||||
npm run backend
|
||||
```
|
||||
|
||||
**Make the script executable by running**
|
||||
|
||||
@@ -66,15 +66,15 @@ You will need all your credentials, (API keys, access tokens, and Mongo Connecti
|
||||
|
||||
### Run the app
|
||||
|
||||
#### Using the command line
|
||||
### Using the command line (in the root directory)
|
||||
To setup the app:
|
||||
1. Run `npm ci`
|
||||
2. Run `npm run frontend`
|
||||
|
||||
- **Run** `npm ci` in the "C:/chatgpt-clone/api" directory
|
||||
- **Run** `npm ci` in the "C:/chatgpt-clone/client" directory
|
||||
- **Run** `npm run build` in the "C:/chatgpt-clone/client"
|
||||
- **Run** `"meilisearch --master-key put_your_meilesearch_Master_Key_here"` in the "C:/chatgpt-clone" directory (Only if SEARCH=TRUE)
|
||||
- **Run** `npm start` in the "C:/chatgpt-clone/api" directory
|
||||
|
||||
- **Visit** http://localhost:3080 (default port) & enjoy
|
||||
To use the app:
|
||||
1. Run `npm run backend`
|
||||
2. Run `meilisearch --master-key put_your_meilesearch_Master_Key_here` (Only if SEARCH=TRUE)
|
||||
3. Visit http://localhost:3080 (default port) & enjoy
|
||||
|
||||
#### Using a batch file
|
||||
|
||||
@@ -92,14 +92,14 @@ start "MeiliSearch" cmd /k "meilisearch --master-key your_master_key_goes_here
|
||||
|
||||
REM ↑↑↑ meilisearch is the name of the meilisearch executable, put your own master key there
|
||||
|
||||
start "ChatGPT-Clone" cmd /k "cd api && npm start"
|
||||
start "ChatGPT-Clone" cmd /k "npm run backend"
|
||||
|
||||
REM this batch file goes at the root of the chatgpt-clone directory (C:/chatgpt-clone/)
|
||||
```
|
||||
|
||||
### Update the app version
|
||||
|
||||
If you update the chatgpt-clone project files, manually redo the `npm ci` and `npm run build` steps
|
||||
If you update the chatgpt-clone project files, manually redo the `npm ci` and `npm run frontend` steps
|
||||
|
||||
##
|
||||
|
||||
|
||||
27147
package-lock.json
generated
File diff suppressed because it is too large
10
package.json
@@ -1,8 +1,16 @@
|
||||
{
|
||||
"name": "chatgpt-clone",
|
||||
"version": "0.4.1",
|
||||
"version": "0.4.5",
|
||||
"description": "",
|
||||
"workspaces": [
|
||||
"api",
|
||||
"client"
|
||||
],
|
||||
"scripts": {
|
||||
"backend": "cd api && npm run start",
|
||||
"backend-dev": "cd api && npm run server-dev",
|
||||
"frontend": "cd client && npm run build",
|
||||
"frontend-dev": "cd client && npm run dev",
|
||||
"e2e": "playwright test --config=e2e/playwright.config.js",
|
||||
"e2e:update": "playwright test --config=e2e/playwright.config.js --update-snapshots",
|
||||
"e2e:debug": "cross-env PWDEBUG=1 playwright test --config=e2e/playwright.config.js",
|
||||
|
||||