Compare commits

...

23 Commits

Author SHA1 Message Date
Danny Avila
501a15a18f Release 0.4.4 (#271) 2023-05-14 20:39:40 -04:00
Danny Avila
6049c9e3ff Fix react errors, max context tokens, and preset mobile view (#269)
* fix: react errors

* fix: max tokens issue

* fix: max tokens issue
2023-05-14 17:26:21 -04:00
Pawan Kumar
262b402606 fix code to adjust max_tokens according to model selection (#263) 2023-05-14 12:16:38 -04:00
Danny Avila
56ea9563b8 refactor(style.css): change font file paths (#268) 2023-05-14 12:12:56 -04:00
Anirudh
2cd6612620 Fonts (#261) 2023-05-14 12:06:53 -04:00
Danny Avila
5d40396fb2 refactor(Conversation.js): change default pageSize from 12 to 14 in getConvosByPage and getConvosQueried functions. Remove unnecessary parentheses and curly braces in getConvosQueried function. Remove unnecessary parentheses in deleteConvos function. (#267) 2023-05-14 11:45:18 -04:00
Anirudh
93dd1eb036 Add Popup Menu to Save Space in Sidebar (#260)
---------

Co-authored-by: Danny Avila <110412045+danny-avila@users.noreply.github.com>
2023-05-14 11:42:17 -04:00
Anton Volnuhin
542a46dc7c Correct the typo in auth.json for accessing Google Palm (#266)
Co-authored-by: Anton Volnuhin <anton@volnuhin.ru>
2023-05-14 11:25:22 -04:00
Anirudh
bf31b1fea0 Msg Clipboard to checkmark (optimistic UX) (#247)
* revert unintended package-lock.json change

* used default checkmark which is included in project

---------

Co-authored-by: Danny Avila <110412045+danny-avila@users.noreply.github.com>
2023-05-14 09:00:20 -04:00
Danny Avila
25d4529ff9 Release v0.4.3 2023-05-13 17:10:19 -04:00
Danny Avila
33d7c67c04 Release v0.4.3 2023-05-13 17:09:25 -04:00
Danny Avila
dc8f762bac Release v0.4.3 2023-05-13 17:08:28 -04:00
Danny Avila
49041e16c7 chore: bump package versions to 0.4.3 (#265) 2023-05-13 16:59:45 -04:00
Danny Avila
3414690e42 Feat: PaLM 2 (#262)
* feat(api): add googleapis package to package.json
feat(api): add reqDemo.js file to make a request to Google Cloud AI Platform API to get a response from a chatbot model.

* feat: add PaLM2 support

* feat(conversationPreset.js): add support for topP and topK for google endpoint
feat(askGoogle.js): add support for topP and topK for google endpoint
feat(ask/index.js): add google endpoint
feat(endpoints.js): add google endpoint
feat(MessageHeader.jsx): add support for modelLabel for google endpoint
feat(PresetItem.jsx): add support for modelLabel for google endpoint
feat(HoverButtons.jsx): add support for google endpoint
feat(createPayload.ts): add google endpoint
feat(types.ts): add google endpoint
feat(store/endpoints.js): add google endpoint
feat(cleanupPreset.js): add support for topP and topK for google endpoint
feat(getDefaultConversation.js): add support for topP and topK for google endpoint
feat(handleSubmit.js): add support for topP and topK for google endpoint

* fix: messages payload

* refactor(GoogleClient.js): set maxContextTokens based on isTextModel value
feat(GoogleClient.js): add delay option to TextStream constructor
feat(getIcon.jsx): add support for google endpoint and PaLM2 model label

* feat: palm frontend changes

* feat(askGoogle.js): set default example to empty input and output
feat(Examples.jsx): add ability to add and remove examples
refactor(Settings.jsx): remove examples from props and setOption function

style(GoogleOptions): remove unnecessary whitespace after Settings2 import
feat(GoogleOptions): add addExample and removeExample functions to manage examples
fix(cleanupPreset): set default example to [{ input: '', output: ''}]
fix(getDefaultConversation): set default example to [{ input: '', output: ''}]
fix(handleSubmit): set default example to [{ input: '', output: ''}]

* style(client): adjust height of settings and examples components to 350px
fix(client): fix path to palm.png image in getIcon.jsx file

* style(EndpointOptionsPopover.jsx, Examples.jsx, Settings.jsx): improve button styles and update input placeholders

* feat (palm): finalize examples on the frontend

* feat(GoogleClient.js): filter out empty examples in options
feat(GoogleClient.js): add support for promptPrefix in buildPayload method
feat(GoogleClient.js): add support for examples in buildPayload method
feat(conversationPreset.js): add maxOutputTokens field to conversation preset schema
feat(presetSchema.js): add examples field to preset schema
feat(askGoogle.js): add support for examples and promptPrefix in endpointOption
feat(EditPresetDialog.jsx): add Examples component for Google endpoint
feat(EditPresetDialog.jsx): add button to show/hide Examples component
feat(EditPresetDialog.jsx): add functionality to add, remove, and edit examples in Examples component
feat(EndpointOptionsDialog.jsx): change endpoint name to PaLM for Google endpoint
feat(Settings.jsx): add maxHeight prop to limit height of Settings component in EditPresetDialog and EndpointOptionsDialog

fix(Settings.jsx): add examples prop to ChatGPTBrowser component
fix(EndpointItem.jsx): add alternate name for google endpoint
fix(MessageHeader.jsx): change title for google endpoint to PaLM
feat(endpoints.js): add google endpoint to endpointsConfig
fix(cleanupPreset.js): add missing comma in examples array

* chore: change endpoint order

* feat(PaLM 2): complete for testing

* fix(PaLM): handle blocked messages
2023-05-13 16:29:06 -04:00
LaraClara
95c97561ae chore: NPM Workspaces and scripts (#244)
* chore: NPM Workspaces and scripts
- Allows everything to be run in the root directory

* chore:Update package-lock after workspace change

* docs: Minor docs typo fix
- most people run in dev mode, ie vite runs the server, this defaults to that method
2023-05-12 09:40:14 -04:00
Danny Avila
8bb4d7d590 Release 0.4.2 2023-05-11 16:46:27 -04:00
Danny Avila
94ad31dce3 Release 0.4.2 (#242)
* release: 0.4.2

* docs: update changelog and readme for v0.4.2 release

* docs(README.md): add important information about new user system and env variables
2023-05-11 16:39:44 -04:00
Danny Avila
91e9b167b3 fix(docker): update .dockerignore to include client/.env file (#241)
fix(docker): add COPY command to copy client/.env file into the container before building
2023-05-11 15:59:43 -04:00
Dan Orlando
53ea3dd9fb Feature/logging system with pino and sanitization (#214) (#227)
* feat: Create structured data logging system with Pino

This commit creates a new feature that enables structured data logging using the Pino logging library. The structured data logging feature allows for more granular and customizable logging, making it easier to analyze and debug issues in the application.

The changes made in this commit include:
- Adding support for structured data logging using the Pino API
- Adding support to redact sensitive data from the logging output using Pino's
  redact option.
- Pino integrates natively with Fluentd, Logstash, Docker logging drivers,
  and other JSON-based systems.

* Add pino package to project

* Logging-System: Add support for an array of regex to redact

* Logging-Systems: Add Redact Patterns and Pino Redact Paths + Boolean logics wasn't right.

Co-authored-by: Olivier Contant <ocontant@users.noreply.github.com>
2023-05-10 23:59:26 -04:00
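For orientation, the commit above describes Pino's `redact` feature; a minimal sketch of such a logger is shown below. The log level, redact paths, and censor string are illustrative assumptions, not the project's actual configuration.

```js
const pino = require('pino');

// Sketch only: structured JSON logging with sensitive fields redacted.
// These redact paths and the censor value are placeholders, not the repo's real list.
const logger = pino({
  level: process.env.LOG_LEVEL || 'info',
  redact: {
    paths: ['req.headers.authorization', 'password', 'user.password'],
    censor: '***REDACTED***'
  }
});

logger.info({ user: { name: 'demo', password: 'hunter2' } }, 'login attempt');
// => {"level":30,...,"user":{"name":"demo","password":"***REDACTED***"},"msg":"login attempt"}
```

Because the output is line-delimited JSON, it can be consumed by Fluentd, Logstash, or Docker logging drivers without extra parsing, which is the integration point the commit message highlights.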
Danny Avila
c72a3a0362 refactor(titleConvo.js, endpoints.js): add support for AZURE_OPENAI_API_KEY environment variable (#235) 2023-05-10 23:56:24 -04:00
Danny Avila
bd068c9a5a feat(chatgpt-client.js, titleConvo.js, genAzureEndpoints.js): add support for Azure OpenAI API endpoint generation (#234)
This commit adds support for generating Azure OpenAI API endpoints in the
`chatgpt-client.js` and `titleConvo.js` files. The `genAzureEndpoint` function
in `genAzureEndpoints.js` generates the endpoint URL based on the provided
parameters. The `chatgpt-client.js` and `titleConvo.js` files now use this
function to generate the endpoint URL when the `AZURE_OPENAI_API_KEY` environment
variable is set.
2023-05-10 23:47:26 -04:00
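Judging from the chatgpt-client.js diff further down, which replaces an inline template literal with a call to `genAzureEndpoint`, the helper presumably looks roughly like the sketch below; treat it as an illustration rather than the exact file contents.

```js
// utils/genAzureEndpoints.js (sketch): build the Azure OpenAI chat completions URL
// from the instance name, deployment name, and API version.
function genAzureEndpoint({
  azureOpenAIApiInstanceName,
  azureOpenAIApiDeploymentName,
  azureOpenAIApiVersion
}) {
  return (
    `https://${azureOpenAIApiInstanceName}.openai.azure.com` +
    `/openai/deployments/${azureOpenAIApiDeploymentName}` +
    `/chat/completions?api-version=${azureOpenAIApiVersion}`
  );
}

module.exports = { genAzureEndpoint };
```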
Youth
7997c3137a * refactor(getCitations.js): add null check for adaptiveCards variable before accessing its properties (#232) 2023-05-10 21:23:15 -04:00
Fuegovic
b466b36e7a Update README.md to v0.4.1 (#224)
* Update CONTRIBUTORS.md

* Update CHANGELOG.md

v0.4.1 Changelog

v0.4.1 changelog and link
Contributors in the TOC

* Update README.md

add a link to the alternative docs from @DavidDev1334

* Update linux_install.md

Credit @DavidDev1334 for linux install doc
2023-05-09 22:12:12 -04:00
82 changed files with 29806 additions and 221 deletions

View File

@@ -1,2 +1,2 @@
**/node_modules
**/.env
api/.env

.gitignore (vendored, 1 line changed)
View File

@@ -59,6 +59,7 @@ src/style - official.css
/playwright/.cache/
.DS_Store
*.code-workspace
.idea
# meilisearch
meilisearch

View File

@@ -1,5 +1,93 @@
# Changelog
<details open>
<summary><strong>2023-05-14</strong></summary>
**Released [v0.4.4](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.4):**
1. The Msg Clipboard was changed to a checkmark for improved user experience by @techwithanirudh in PR [#247](https://github.com/danny-avila/chatgpt-clone/pull/247).
2. A typo in the auth.json path for accessing Google PaLM was corrected by @antonme in PR [#266](https://github.com/danny-avila/chatgpt-clone/pull/266).
3. @techwithanirudh added a Popup Menu to save sidebar space in PR [#260](https://github.com/danny-avila/chatgpt-clone/pull/260).
4. The default pageSize in Conversation.js was increased from 12 to 14 by @danny-avila in PR [#267](https://github.com/danny-avila/chatgpt-clone/pull/267).
5. Fonts were updated by @techwithanirudh in PR [#261](https://github.com/danny-avila/chatgpt-clone/pull/261).
6. Font file paths in style.css were changed by @danny-avila in PR [#268](https://github.com/danny-avila/chatgpt-clone/pull/268).
7. Code was fixed to adjust max_tokens according to model selection by @p4w4n in PR [#263](https://github.com/danny-avila/chatgpt-clone/pull/263).
8. React errors, the max context tokens issue, and the preset mobile view were fixed by @danny-avila in PR [#269](https://github.com/danny-avila/chatgpt-clone/pull/269).
New contributors to the project include:
- @techwithanirudh, who made their first contribution in PR [#247](https://github.com/danny-avila/chatgpt-clone/pull/247).
- @antonme, who made their first contribution in PR [#266](https://github.com/danny-avila/chatgpt-clone/pull/266).
- @p4w4n, who made their first contribution in PR [#263](https://github.com/danny-avila/chatgpt-clone/pull/263).
The [full changelog can be found here](https://github.com/danny-avila/chatgpt-clone/compare/v0.4.3...v0.4.4)
</details>
<details>
<summary><strong>2023-05-13</strong></summary>
**Released [v0.4.3](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.3) which now supports Google's PaLM 2!**
![image](https://github.com/danny-avila/chatgpt-clone/assets/110412045/ec5e8ff3-6c3a-4f25-9687-d8558435d094)
**How to Setup PaLM 2 (via Google Cloud Vertex AI API)**
- Enable the Vertex AI API on Google Cloud:
  - https://console.cloud.google.com/vertex-ai
- Create a Service Account:
  - https://console.cloud.google.com/projectselector/iam-admin/serviceaccounts/create?walkthrough_id=iam--create-service-account#step_index=1
- Make sure to click 'Create and Continue' and grant at least the 'Vertex AI User' role.
- Create a JSON key, rename it to 'auth.json', and save it in /api/data/.
**Alternatively**
- In your ./api/.env file, set PALM_KEY to "user_provided" to allow the user to provide a Service Account key JSON from the UI.
- Users follow the steps above, except instead of renaming the file they simply import the JSON when prompted.
- The key is sent to the server but never saved there; it is only stored in your browser's local storage.
**Note:**
- Vertex AI does not (yet) support response streaming for text generations, so responses may seem slow when a lot of text is generated.
- Text streaming is simulated.
You can check the full changelog in between v0.4.2 and v0.4.3 [here](https://github.com/danny-avila/chatgpt-clone/compare/v0.4.2...v0.4.3).
</details>
<details>
<summary><strong>2023-05-11</strong></summary>
**Released [v0.4.2](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.2)**
ChatGPT-Clone received some important upgrades and improvements. A new contributor, [@qcgm1978](https://github.com/qcgm1978), makes their first contribution by adding a null check for the adaptiveCards variable. Additionally, support for titling conversations with the Azure endpoint is added by [@danny-avila](https://github.com/danny-avila) in PR [#234](https://github.com/danny-avila/chatgpt-clone/pull/234). In PR [#235](https://github.com/danny-avila/chatgpt-clone/pull/235), [@danny-avila](https://github.com/danny-avila) also makes some necessary fixes to titling, quotation marks, and endpoints being unavailable when only the Azure key is provided. The logging system is now powered by Pino with sanitization, thanks to [@danorlando](https://github.com/danorlando) in PR [#227](https://github.com/danny-avila/chatgpt-clone/pull/227). To bulletproof the Docker container, the .dockerignore file is updated to include the client/.env file by [@danny-avila](https://github.com/danny-avila) in PR [#241](https://github.com/danny-avila/chatgpt-clone/pull/241); this issue was brought to our attention on Discord.
There is active work on the new Plugins feature, on converting the frontend to TypeScript, and on integrating PaLM 2, Google's new generative AI accessible via API, into the project as a new endpoint.
You can check the full changelog between [v0.4.1](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.1) and [v0.4.2](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.2) [here](https://github.com/danny-avila/chatgpt-clone/compare/v0.4.1...v0.4.2).
For discussion and suggestions, you can join us on the **[community Discord server](https://discord.gg/NGaa9RPCft)**.
</details>
<details>
<summary><strong>2023-05-09</strong></summary>
**Released [v0.4.1](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.1)**
* update user system section of readme by @danorlando in #207
* remove github-passport and update package.lock files by @danorlando in #208
* Update README.md by @fuegovic in #209
* fix: fix browser refresh redirecting to /chat/new by @danorlando in #210
* fix: fix issue with validation when google account has multiple spaces in username by @danorlando in #211
* chore: update docker image version to use latest by @danny-avila in #218
* update documentation structure by @fuegovic in #220
* Feat: Add Azure support by @danny-avila in #219
* Update Message.js by @DavidDev1334 in #191
⚠️ **IMPORTANT:** Since v0.4.0, you should register and log in with a local account (email and password) for your first sign-up. If you log in for the first time with a social login account (e.g. Google, Facebook, etc.), the conversations and presets that you created before the user system was implemented will NOT be migrated to that account.
⚠️ **Breaking - new env variables:** Since v0.4.0, you will need to add the new env variables from .env.example for the app to work, even if you're not using multiple users.
For discussion and suggestions, you can join us on the **[community Discord server](https://discord.gg/NGaa9RPCft)**.
</details>
<details>
<summary><strong>2023-05-07</strong></summary>
**Released [v0.4.0](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.0)**, introducing the User/Auth System and OAuth2/Social Login! You can now register and log in with an email account or use Google login. Your previous conversations and presets will migrate to your new profile upon creation. Check out the details in the [User/Auth System](#userauth-system) section of the README.md.
@@ -10,11 +98,7 @@
For discussion and suggestions, you can join us on the **[community Discord server](https://discord.gg/NGaa9RPCft)**.
</details>
<details>
<summary><strong>Previous Updates</strong></summary>
<details>
<summary><strong>2023-04-05</strong></summary>
@@ -45,7 +129,7 @@ The above features are next and then I will have to focus on building the **test
On that note, I had to switch the default branch due to some breaking changes that haven't been straightforward to debug, mainly related to node-chat-gpt, the main dependency of the project. Thankfully, my working branch, now the default as main, is working as expected.
</details>
</details>
##

View File

@@ -1,6 +1,6 @@
# Contributors List
Here is a list of all contributors to this project:
We appreciate all the contributors who helped make this project possible:
- danny-avila (Admin)
- wtlyu (Contributor)
@@ -8,11 +8,12 @@ Here is a list of all contributors to this project:
- alfredo-f (Contributor)
- HyunggyuJang (Contributor)
- fuegovic (Contributor)
- DavidDev1334
- toordog (Contributor)
- heathriel (External Contributor)
- hackreactor-bot (Contributor)
- git-bruh (Contributor)
zhangsean (Contributor)
- zhangsean (Contributor)
- llk89 (Contributor)
- adamrb (Contributor)

View File

@@ -1,6 +1,7 @@
FROM node:19-alpine AS react-client
WORKDIR /client
# copy package.json into the container at /client
COPY /client/.env /client/.env
COPY /client/package*.json /client/
# install dependencies
RUN npm ci

View File

@@ -40,18 +40,63 @@
##
<details open>
<summary><strong>2023-05-07</strong></summary>
**Released [v0.4.0](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.0)**, introducing the User/Auth System and OAuth2/Social Login! You can now register and log in with an email account or use Google login. Your previous conversations and presets will migrate to your new profile upon creation. Check out the details in the [User/Auth System](documents/features/user_auth_system.md) section of the README.md.
## **Google's PaLM 2 is now supported as of [v0.4.3](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.3)**
![image](https://github.com/danny-avila/chatgpt-clone/assets/110412045/ec5e8ff3-6c3a-4f25-9687-d8558435d094)
⚠️ **IMPORTANT:** You should register and log in with a local account (email and password) for your first sign-up. If you log in for the first time with a social login account (e.g. Google, Facebook, etc.), the conversations and presets that you created before the user system was implemented will NOT be migrated to that account.
<details>
<summary><strong>How to Setup PaLM 2 (via Google Cloud Vertex AI API)</strong></summary>
- Enable the Vertex AI API on Google Cloud:
  - https://console.cloud.google.com/vertex-ai
- Create a Service Account:
  - https://console.cloud.google.com/projectselector/iam-admin/serviceaccounts/create?walkthrough_id=iam--create-service-account#step_index=1
- Make sure to click 'Create and Continue' and grant at least the 'Vertex AI User' role.
- Create a JSON key, rename it to 'auth.json', and save it in /api/data/.
⚠️ **Breaking - new env variables:** You will need to add the new env variables from .env.example for the app to work, even if you're not using multiple users.
**Alternatively**
- In your ./api/.env file, set PALM_KEY to "user_provided" to allow the user to provide a Service Account key JSON from the UI.
- Users follow the steps above, except instead of renaming the file they simply import the JSON when prompted.
- The key is sent to the server but never saved there; it is only stored in your browser's local storage.
**Note:**
- Vertex AI does not (yet) support response streaming for text generations, so responses may seem slow when a lot of text is generated.
- Text streaming is simulated.
</details>
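To connect these setup steps to the code added later in this compare: the new GoogleAgent class reads only `client_email`, `project_id`, and `private_key` from the Service Account JSON and authenticates with a JWT scoped to the Cloud Platform. A rough sketch of that wiring, with illustrative values and assumed paths:

```js
// Sketch only, not the project's actual route handler.
const { google } = require('googleapis');
const credentials = require('./api/data/auth.json'); // the key saved in the steps above

// Essentially what GoogleAgent#getClient does with those three fields:
const jwtClient = new google.auth.JWT(
  credentials.client_email,
  null,
  credentials.private_key,
  ['https://www.googleapis.com/auth/cloud-platform']
);

// Predictions are POSTed to the Vertex AI publisher model endpoint:
const url =
  `https://us-central1-aiplatform.googleapis.com/v1/projects/${credentials.project_id}` +
  '/locations/us-central1/publishers/google/models/chat-bison:predict';

jwtClient
  .request({
    url,
    method: 'POST',
    data: {
      instances: [{ messages: [{ author: 'User', content: 'Hello PaLM 2!' }] }],
      parameters: { temperature: 0.2, topP: 0.95, topK: 40 }
    }
  })
  .then((res) => console.dir(res.data, { depth: null }))
  .catch(console.error);
```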
---
<details open>
<summary><strong>2023-05-14</strong></summary>
**Released [v0.4.4](https://github.com/danny-avila/chatgpt-clone/releases/tag/v0.4.4):**
1. The Msg Clipboard was changed to a checkmark for improved user experience by @techwithanirudh in PR [#247](https://github.com/danny-avila/chatgpt-clone/pull/247).
2. A typo in the auth.json path for accessing Google PaLM was corrected by @antonme in PR [#266](https://github.com/danny-avila/chatgpt-clone/pull/266).
3. @techwithanirudh added a Popup Menu to save sidebar space in PR [#260](https://github.com/danny-avila/chatgpt-clone/pull/260).
4. The default pageSize in Conversation.js was increased from 12 to 14 by @danny-avila in PR [#267](https://github.com/danny-avila/chatgpt-clone/pull/267).
5. Fonts were updated by @techwithanirudh in PR [#261](https://github.com/danny-avila/chatgpt-clone/pull/261).
6. Font file paths in style.css were changed by @danny-avila in PR [#268](https://github.com/danny-avila/chatgpt-clone/pull/268).
7. Code was fixed to adjust max_tokens according to model selection by @p4w4n in PR [#263](https://github.com/danny-avila/chatgpt-clone/pull/263).
8. React errors, the max context tokens issue, and the preset mobile view were fixed by @danny-avila in PR [#269](https://github.com/danny-avila/chatgpt-clone/pull/269).
New contributors to the project include:
- @techwithanirudh, who made their first contribution in PR [#247](https://github.com/danny-avila/chatgpt-clone/pull/247).
- @antonme, who made their first contribution in PR [#266](https://github.com/danny-avila/chatgpt-clone/pull/266).
- @p4w4n, who made their first contribution in PR [#263](https://github.com/danny-avila/chatgpt-clone/pull/263).
The [full changelog can be found here](https://github.com/danny-avila/chatgpt-clone/compare/v0.4.3...v0.4.4)
⚠️ **IMPORTANT:** Since v0.4.0, you should register and log in with a local account (email and password) for your first sign-up. If you log in for the first time with a social login account (e.g. Google, Facebook, etc.), the conversations and presets that you created before the user system was implemented will NOT be migrated to that account.
⚠️ **Breaking - new env variables:** Since v0.4.0, you will need to add the new env variables from .env.example for the app to work, even if you're not using multiple users.
For discussion and suggestions, you can join us on the **[community Discord server](https://discord.gg/NGaa9RPCft)**.
</details>
[Past Updates](CHANGELOG.md)
##
<h1>Table of Contents</h1>
@@ -96,6 +141,8 @@ For discussion and suggestion you can join us: **[community discord server](http
* [Documentation Guidelines](documents/contributions/documentation_guidelines.md)
* [Testing](documents/contributions/testing.md)
* [Pull Request Template](documents/contributions/pull_request_template.md)
* [Contributors](CONTRIBUTORS.md)
* [Trello Board](https://trello.com/b/17z094kq/chatgpt-clone)
</details>
<details>
@@ -106,6 +153,9 @@ For discussion and suggestion you can join us: **[community discord server](http
* [Feature Request Template](documents/report_templates/feature_request_template.md)
</details>
##
### [Alternative Documentation](https://chatgpt-clone.gitbook.io/chatgpt-clone-docs/get-started/docker)
##
## Contributing
@@ -121,3 +171,6 @@ For new features, components, or extensions, please open an issue and discuss be
This project is licensed under the [MIT License](LICENSE.md).
##
## Star History
[![Star History Chart](https://api.star-history.com/svg?repos=danny-avila/chatgpt-clone&type=Date)](https://star-history.com/#danny-avila/chatgpt-clone&Date)

View File

@@ -89,6 +89,33 @@ CHATGPT_MODELS=text-davinci-002-render-sha,text-davinci-002-render-paid,gpt-4
# By default, the server will use the node-chatgpt-api recommended proxy (a third party server).
# CHATGPT_REVERSE_PROXY=
##########################
# PaLM (Google) Endpoint:
##########################
# PaLM 2 Client (via Google Cloud Vertex AI API)
# Steps:
# Enable the Vertex AI API on Google Cloud:
# https://console.cloud.google.com/vertex-ai
# Create a Service Account:
# https://console.cloud.google.com/projectselector/iam-admin/serviceaccounts/create?walkthrough_id=iam--create-service-account#step_index=1
# Make sure to click 'Create and Continue' to give at least the 'Vertex AI User' role.
# Create a JSON key, rename as 'auth.json' and save it in /api/data/.
# Alternatively
# Uncomment below PALM_KEY and set as "user_provided" to allow the user to provide a Service Account key JSON from the UI.
# They will follow the steps above except for renaming the file.
# Leave blank or omit to disable this endpoint
# PALM_KEY="user_provided"
# In case you need a reverse proxy for this endpoint:
# GOOGLE_REVERSE_PROXY=
##########################
# Proxy: To be Used by all endpoints
##########################
PROXY=
##########################
# Search:
##########################

View File

@@ -1,5 +1,6 @@
require('dotenv').config();
const { KeyvFile } = require('keyv-file');
const { genAzureEndpoint } = require('../../utils/genAzureEndpoints');
const askClient = async ({
text,
@@ -22,12 +23,13 @@ const askClient = async ({
};
const azure = process.env.AZURE_OPENAI_API_KEY ? true : false;
const maxContextTokens = model === 'gpt-4' ? 8191 : model === 'gpt-4-32k' ? 32767 : 4095; // 1 less than maximum
const clientOptions = {
reverseProxyUrl: process.env.OPENAI_REVERSE_PROXY || null,
azure,
maxContextTokens,
modelOptions: {
model: model,
model,
temperature,
top_p,
presence_penalty,
@@ -36,18 +38,22 @@ const askClient = async ({
chatGptLabel,
promptPrefix,
proxy: process.env.PROXY || null,
debug: false
// debug: true
};
let apiKey = process.env.OPENAI_KEY;
if (azure) {
apiKey = process.env.AZURE_OPENAI_API_KEY;
clientOptions.reverseProxyUrl = `https://${process.env.AZURE_OPENAI_API_INSTANCE_NAME}.openai.azure.com/openai/deployments/${process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME}/chat/completions?api-version=${process.env.AZURE_OPENAI_API_VERSION}`;
clientOptions.reverseProxyUrl = genAzureEndpoint({
azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION
});
}
const client = new ChatGPTClient(apiKey, clientOptions, store);
const options = {
onProgress,
abortController,

View File

@@ -0,0 +1,390 @@
const crypto = require('crypto');
const TextStream = require('../stream');
const { google } = require('googleapis');
const { Agent, ProxyAgent } = require('undici');
const { getMessages, saveMessage, saveConvo } = require('../../models');
const { encoding_for_model: encodingForModel, get_encoding: getEncoding } = require('@dqbd/tiktoken');
const tokenizersCache = {};
class GoogleAgent {
constructor(credentials, options = {}) {
this.client_email = credentials.client_email;
this.project_id = credentials.project_id;
this.private_key = credentials.private_key;
this.setOptions(options);
this.currentDateString = new Date().toLocaleDateString('en-us', {
year: 'numeric',
month: 'long',
day: 'numeric'
});
}
constructUrl() {
return `https://us-central1-aiplatform.googleapis.com/v1/projects/${this.project_id}/locations/us-central1/publishers/google/models/${this.modelOptions.model}:predict`;
}
setOptions(options) {
if (this.options && !this.options.replaceOptions) {
// nested options aren't spread properly, so we need to do this manually
this.options.modelOptions = {
...this.options.modelOptions,
...options.modelOptions
};
delete options.modelOptions;
// now we can merge options
this.options = {
...this.options,
...options
};
} else {
this.options = options;
}
this.options.examples = this.options.examples.filter(
obj => obj.input.content !== '' && obj.output.content !== ''
);
const modelOptions = this.options.modelOptions || {};
this.modelOptions = {
...modelOptions,
// set some good defaults (check for undefined in some cases because they may be 0)
model: modelOptions.model || 'chat-bison',
temperature: typeof modelOptions.temperature === 'undefined' ? 0.2 : modelOptions.temperature, // 0 - 1, 0.2 is recommended
topP: typeof modelOptions.topP === 'undefined' ? 0.95 : modelOptions.topP, // 0 - 1, default: 0.95
topK: typeof modelOptions.topK === 'undefined' ? 40 : modelOptions.topK // 1-40, default: 40
// stop: modelOptions.stop // no stop method for now
};
this.isChatModel = this.modelOptions.model.startsWith('chat-');
const { isChatModel } = this;
this.isTextModel = this.modelOptions.model.startsWith('text-');
const { isTextModel } = this;
this.maxContextTokens = this.options.maxContextTokens || (isTextModel ? 8000 : 4096);
// The max prompt tokens is determined by the max context tokens minus the max response tokens.
// Earlier messages will be dropped until the prompt is within the limit.
this.maxResponseTokens = this.modelOptions.maxOutputTokens || 1024;
this.maxPromptTokens = this.options.maxPromptTokens || this.maxContextTokens - this.maxResponseTokens;
if (this.maxPromptTokens + this.maxResponseTokens > this.maxContextTokens) {
throw new Error(
`maxPromptTokens + maxOutputTokens (${this.maxPromptTokens} + ${this.maxResponseTokens} = ${
this.maxPromptTokens + this.maxResponseTokens
}) must be less than or equal to maxContextTokens (${this.maxContextTokens})`
);
}
this.userLabel = this.options.userLabel || 'User';
this.modelLabel = this.options.modelLabel || 'Assistant';
if (isChatModel) {
// Use these faux tokens to help the AI understand the context since we are building the chat log ourselves.
// Trying to use "<|im_start|>" causes the AI to still generate "<" or "<|" at the end sometimes for some reason,
// without tripping the stop sequences, so I'm using "||>" instead.
this.startToken = '||>';
this.endToken = '';
this.gptEncoder = this.constructor.getTokenizer('cl100k_base');
} else if (isTextModel) {
this.startToken = '<|im_start|>';
this.endToken = '<|im_end|>';
this.gptEncoder = this.constructor.getTokenizer('text-davinci-003', true, {
'<|im_start|>': 100264,
'<|im_end|>': 100265
});
} else {
// Previously I was trying to use "<|endoftext|>" but there seems to be some bug with OpenAI's token counting
// system that causes only the first "<|endoftext|>" to be counted as 1 token, and the rest are not treated
// as a single token. So we're using this instead.
this.startToken = '||>';
this.endToken = '';
try {
this.gptEncoder = this.constructor.getTokenizer(this.modelOptions.model, true);
} catch {
this.gptEncoder = this.constructor.getTokenizer('text-davinci-003', true);
}
}
if (!this.modelOptions.stop) {
const stopTokens = [this.startToken];
if (this.endToken && this.endToken !== this.startToken) {
stopTokens.push(this.endToken);
}
stopTokens.push(`\n${this.userLabel}:`);
stopTokens.push('<|diff_marker|>');
// I chose not to do one for `modelLabel` because I've never seen it happen
this.modelOptions.stop = stopTokens;
}
if (this.options.reverseProxyUrl) {
this.completionsUrl = this.options.reverseProxyUrl;
} else {
this.completionsUrl = this.constructUrl();
}
return this;
}
static getTokenizer(encoding, isModelName = false, extendSpecialTokens = {}) {
if (tokenizersCache[encoding]) {
return tokenizersCache[encoding];
}
let tokenizer;
if (isModelName) {
tokenizer = encodingForModel(encoding, extendSpecialTokens);
} else {
tokenizer = getEncoding(encoding, extendSpecialTokens);
}
tokenizersCache[encoding] = tokenizer;
return tokenizer;
}
async getClient() {
const scopes = ['https://www.googleapis.com/auth/cloud-platform'];
const jwtClient = new google.auth.JWT(this.client_email, null, this.private_key, scopes);
jwtClient.authorize((err) => {
if (err) {
console.log(err);
throw err;
}
});
return jwtClient;
}
buildPayload(input, { messages = [] }) {
let payload = {
instances: [
{
messages: [...messages, { author: this.userLabel, content: input }]
}
],
parameters: this.options.modelOptions
};
if (this.options.promptPrefix) {
payload.instances[0].context = this.options.promptPrefix;
}
if (this.options.examples.length > 0) {
payload.instances[0].examples = this.options.examples;
}
if (this.isTextModel) {
payload.instances = [
{
prompt: input
}
];
}
if (this.options.debug) {
console.debug('buildPayload');
console.dir(payload, { depth: null });
}
return payload;
}
async getCompletion(input, messages = [], abortController = null) {
if (!abortController) {
abortController = new AbortController();
}
const { debug } = this.options;
const url = this.completionsUrl;
if (debug) {
console.debug();
console.debug(url);
console.debug(this.modelOptions);
console.debug();
}
const opts = {
method: 'POST',
agent: new Agent({
bodyTimeout: 0,
headersTimeout: 0
}),
signal: abortController.signal
};
if (this.options.proxy) {
opts.agent = new ProxyAgent(this.options.proxy);
}
const client = await this.getClient();
const payload = this.buildPayload(input, { messages });
const res = await client.request({ url, method: 'POST', data: payload });
console.dir(res.data, { depth: null });
return res.data;
}
async loadHistory(conversationId, parentMessageId = null) {
if (this.options.debug) {
console.debug('Loading history for conversation', conversationId, parentMessageId);
}
if (!parentMessageId) {
return [];
}
const messages = (await getMessages({ conversationId })) || [];
if (messages.length === 0) {
this.currentMessages = [];
return [];
}
const orderedMessages = this.constructor.getMessagesForConversation(messages, parentMessageId);
return orderedMessages.map((message) => {
return {
author: message.isCreatedByUser ? this.userLabel : this.modelLabel,
content: message.content
};
});
}
async saveMessageToDatabase(message, user = null) {
await saveMessage({ ...message, unfinished: false });
await saveConvo(user, {
conversationId: message.conversationId,
endpoint: 'google',
...this.modelOptions
});
}
async sendMessage(message, opts = {}) {
if (opts && typeof opts === 'object') {
this.setOptions(opts);
}
console.log('sendMessage', message, opts);
const user = opts.user || null;
const conversationId = opts.conversationId || crypto.randomUUID();
const parentMessageId = opts.parentMessageId || '00000000-0000-0000-0000-000000000000';
const userMessageId = crypto.randomUUID();
const responseMessageId = crypto.randomUUID();
const messages = await this.loadHistory(conversationId, this.options?.parentMessageId);
const userMessage = {
messageId: userMessageId,
parentMessageId,
conversationId,
sender: 'User',
text: message,
isCreatedByUser: true
};
if (typeof opts?.getIds === 'function') {
opts.getIds({
userMessage,
conversationId,
responseMessageId
});
}
console.log('userMessage', userMessage);
await this.saveMessageToDatabase(userMessage, user);
let reply = '';
let blocked = false;
try {
const result = await this.getCompletion(message, messages, opts.abortController);
blocked = result?.predictions?.[0]?.safetyAttributes?.blocked;
reply = result?.predictions?.[0]?.candidates?.[0]?.content || result?.predictions?.[0]?.content || '';
if (blocked === true) {
reply = `Google blocked a proper response to your message:\n${JSON.stringify(
result.predictions[0].safetyAttributes
)}${reply.length > 0 ? `\nAI Response:\n${reply}` : ''}`;
}
if (this.options.debug) {
console.debug('result');
console.debug(result);
}
} catch (err) {
console.error(err);
}
if (this.options.debug) {
console.debug('options');
console.debug(this.options);
}
if (!blocked) {
const textStream = new TextStream(reply, { delay: 0.5 });
await textStream.processTextStream(opts.onProgress);
}
const responseMessage = {
messageId: responseMessageId,
conversationId,
parentMessageId: userMessage.messageId,
sender: 'PaLM2',
text: reply,
error: blocked,
isCreatedByUser: false
};
await this.saveMessageToDatabase(responseMessage, user);
return responseMessage;
}
getTokenCount(text) {
return this.gptEncoder.encode(text, 'all').length;
}
/**
* Algorithm adapted from "6. Counting tokens for chat API calls" of
* https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb
*
* An additional 2 tokens need to be added for metadata after all messages have been counted.
*
* @param {*} message
*/
getTokenCountForMessage(message) {
// Map each property of the message to the number of tokens it contains
const propertyTokenCounts = Object.entries(message).map(([key, value]) => {
// Count the number of tokens in the property value
const numTokens = this.getTokenCount(value);
// Subtract 1 token if the property key is 'name'
const adjustment = key === 'name' ? 1 : 0;
return numTokens - adjustment;
});
// Sum the number of tokens in all properties and add 4 for metadata
return propertyTokenCounts.reduce((a, b) => a + b, 4);
}
/**
* Iterate through messages, building an array based on the parentMessageId.
* Each message has an id and a parentMessageId. The parentMessageId is the id of the message that this message is a reply to.
* @param messages
* @param parentMessageId
* @returns {*[]} An array containing the messages in the order they should be displayed, starting with the root message.
*/
static getMessagesForConversation(messages, parentMessageId) {
const orderedMessages = [];
let currentMessageId = parentMessageId;
while (currentMessageId) {
// eslint-disable-next-line no-loop-func
const message = messages.find(m => m.messageId === currentMessageId);
if (!message) {
break;
}
orderedMessages.unshift(message);
currentMessageId = message.parentMessageId;
}
if (orderedMessages.length === 0) {
return [];
}
return orderedMessages.map(msg => ({
isCreatedByUser: msg.isCreatedByUser,
content: msg.text
}));
}
}
module.exports = GoogleAgent;
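A usage sketch for the class above, assuming a valid auth.json key and a reachable database for the saved messages; the require paths and example values are assumptions:

```js
// Sketch: how a caller (e.g. the ask/askGoogle route) could drive GoogleAgent.
const credentials = require('../../data/auth.json'); // path is an assumption
const GoogleAgent = require('./GoogleClient');       // path is an assumption

const client = new GoogleAgent(credentials, {
  modelOptions: { model: 'chat-bison' }, // defaults: temperature 0.2, topP 0.95, topK 40
  examples: [
    { input: { content: 'What is Vertex AI?' }, output: { content: 'A managed ML platform on Google Cloud.' } }
  ]
});

client
  .sendMessage('Summarize PaLM 2 in one sentence.', {
    onProgress: (chunk) => process.stdout.write(chunk) // streamed via the simulated TextStream
  })
  .then((response) => console.log('\nfinal:', response.text))
  .catch(console.error);
```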

api/app/stream.js (new file, 62 lines)
View File

@@ -0,0 +1,62 @@
const { Readable } = require('stream');
class TextStream extends Readable {
constructor(text, options = {}) {
super(options);
this.text = text;
this.currentIndex = 0;
this.delay = options.delay || 20; // Time in milliseconds
}
_read() {
const minChunkSize = 2;
const maxChunkSize = 4;
const { delay } = this;
if (this.currentIndex < this.text.length) {
setTimeout(() => {
const remainingChars = this.text.length - this.currentIndex;
const chunkSize = Math.min(
this.randomInt(minChunkSize, maxChunkSize + 1),
remainingChars
);
const chunk = this.text.slice(this.currentIndex, this.currentIndex + chunkSize);
this.push(chunk);
this.currentIndex += chunkSize;
}, delay);
} else {
this.push(null); // signal end of data
}
}
randomInt(min, max) {
return Math.floor(Math.random() * (max - min)) + min;
}
async processTextStream(onProgressCallback) {
const streamPromise = new Promise((resolve, reject) => {
this.on('data', (chunk) => {
onProgressCallback(chunk.toString());
});
this.on('end', () => {
console.log('Stream ended');
resolve();
});
this.on('error', (err) => {
reject(err);
});
});
try {
await streamPromise;
} catch (err) {
console.error('Error processing text stream:', err);
// Handle the error appropriately, e.g., return an error message or throw an error
}
}
}
module.exports = TextStream;
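A quick usage sketch for the stream above, mirroring how GoogleClient.js feeds it a finished reply to simulate token-by-token streaming:

```js
// Sketch: emit a prepared reply in small random chunks, as the PaLM client does.
const TextStream = require('./stream'); // relative to api/app, as added in this diff

const stream = new TextStream('Vertex AI does not stream text yet, so streaming is simulated.', {
  delay: 20 // milliseconds between each 2-4 character chunk
});

stream
  .processTextStream((chunk) => process.stdout.write(chunk))
  .then(() => process.stdout.write('\n'));
```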

View File

@@ -1,7 +1,8 @@
const { Configuration, OpenAIApi } = require('openai');
const _ = require('lodash');
const { genAzureEndpoint } = require('../utils/genAzureEndpoints');
const proxyEnvToAxiosProxy = proxyString => {
const proxyEnvToAxiosProxy = (proxyString) => {
if (!proxyString) return null;
const regex = /^([^:]+):\/\/(?:([^:@]*):?([^:@]*)@)?([^:]+)(?::(\d+))?/;
@@ -33,7 +34,9 @@ const titleConvo = async ({ endpoint, text, response }) => {
||>Title:`
};
const azure = process.env.AZURE_OPENAI_API_KEY ? true : false;
const options = {
azure,
reverseProxyUrl: process.env.OPENAI_REVERSE_PROXY || null,
proxy: process.env.PROXY || null
};
@@ -47,9 +50,20 @@ const titleConvo = async ({ endpoint, text, response }) => {
frequency_penalty: 0
};
const titleGenClient = new ChatGPTClient(process.env.OPENAI_KEY, titleGenClientOptions);
let apiKey = process.env.OPENAI_KEY;
if (azure) {
apiKey = process.env.AZURE_OPENAI_API_KEY;
titleGenClientOptions.reverseProxyUrl = genAzureEndpoint({
azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION
});
}
const titleGenClient = new ChatGPTClient(apiKey, titleGenClientOptions);
const result = await titleGenClient.getCompletion([instructionsPayload], null);
title = result.choices[0].message.content.replace(/\s+/g, ' ').trim();
title = result.choices[0].message.content.replace(/\s+/g, ' ').replaceAll('"', '').trim();
} catch (e) {
console.error(e);
console.log('There was an issue generating title, see error above');

View File

@@ -2,7 +2,8 @@
const regex = / \[.*?]\(.*?\)/g;
const getCitations = (res) => {
const textBlocks = res.details.adaptiveCards[0].body;
const adaptiveCards = res.details.adaptiveCards;
const textBlocks = adaptiveCards && adaptiveCards[0].body;
if (!textBlocks) return '';
let links = textBlocks[textBlocks.length - 1]?.text.match(regex);
if (links?.length === 0 || !links) return '';

View File

@@ -30,7 +30,7 @@ module.exports = {
return { message: 'Error saving conversation' };
}
},
getConvosByPage: async (user, pageNumber = 1, pageSize = 12) => {
getConvosByPage: async (user, pageNumber = 1, pageSize = 14) => {
try {
const totalConvos = (await Conversation.countDocuments({ user })) || 1;
const totalPages = Math.ceil(totalConvos / pageSize);
@@ -45,7 +45,7 @@ module.exports = {
return { message: 'Error getting conversations' };
}
},
getConvosQueried: async (user, convoIds, pageNumber = 1, pageSize = 12) => {
getConvosQueried: async (user, convoIds, pageNumber = 1, pageSize = 14) => {
try {
if (!convoIds || convoIds.length === 0) {
return { conversations: [], pages: 1, pageNumber, pageSize };
@@ -57,7 +57,7 @@ module.exports = {
// will handle a syncing solution soon
const deletedConvoIds = [];
convoIds.forEach((convo) =>
convoIds.forEach(convo =>
promises.push(
Conversation.findOne({
user,
@@ -120,7 +120,7 @@ module.exports = {
},
deleteConvos: async (user, filter) => {
let toRemove = await Conversation.find({ ...filter, user }).select('conversationId');
const ids = toRemove.map((instance) => instance.conversationId);
const ids = toRemove.map(instance => instance.conversationId);
let deleteCount = await Conversation.deleteMany({ ...filter, user }).exec();
deleteCount.messages = await deleteMessages({ conversationId: { $in: ids } });
return deleteCount;

View File

@@ -17,6 +17,12 @@ module.exports = {
default: null,
required: false
},
// for google only
modelLabel: {
type: String,
default: null,
required: false
},
promptPrefix: {
type: String,
default: null,
@@ -32,6 +38,22 @@ module.exports = {
default: 1,
required: false
},
// for google only
topP: {
type: Number,
default: 0.95,
required: false
},
topK: {
type: Number,
default: 40,
required: false
},
maxOutputTokens: {
type: Number,
default: 1024,
required: false
},
presence_penalty: {
type: Number,
default: 0,

View File

@@ -20,6 +20,8 @@ const convoSchema = mongoose.Schema(
default: null
},
messages: [{ type: mongoose.Schema.Types.ObjectId, ref: 'Message' }],
// google only
examples: [{ type: mongoose.Schema.Types.Mixed }],
...conversationPreset,
// for bingAI only
jailbreakConversationId: {

View File

@@ -17,6 +17,8 @@ const presetSchema = mongoose.Schema(
type: String,
default: null
},
// google only
examples: [{ type: mongoose.Schema.Types.Mixed }],
...conversationPreset
},
{ timestamps: true }

api/package-lock.json (generated, 374 lines changed)
View File

@@ -1,12 +1,12 @@
{
"name": "chatgpt-clone",
"version": "0.4.1",
"version": "0.4.4",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "chatgpt-clone",
"version": "0.4.1",
"version": "0.4.4",
"license": "ISC",
"dependencies": {
"@dqbd/tiktoken": "^1.0.2",
@@ -22,6 +22,7 @@
"dotenv": "^16.0.3",
"eslint": "^8.36.0",
"express": "^4.18.2",
"googleapis": "^118.0.0",
"handlebars": "^4.7.7",
"html": "^1.0.0",
"joi": "^14.3.1",
@@ -39,6 +40,7 @@
"passport-google-oauth20": "^2.0.0",
"passport-jwt": "^4.0.1",
"passport-local": "^1.0.0",
"pino": "^8.12.1",
"sanitize": "^2.1.2"
},
"devDependencies": {
@@ -1997,6 +1999,14 @@
"resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-1.1.1.tgz",
"integrity": "sha512-PCVAQswWemu6UdxsDFFX/+gVeYqKAod3D3UVm91jHwynguOwAvYPhx8nNlM++NqRcK6CxxpUafjmhIdKiHibqg=="
},
"node_modules/arrify": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/arrify/-/arrify-2.0.1.tgz",
"integrity": "sha512-3duEwti880xqi4eAMN8AyR4a0ByT90zoYdLlevfrvU43vb0YZwZVfxOgxWrLXXXpyugL0hNZc9G6BiB5B3nUug==",
"engines": {
"node": ">=8"
}
},
"node_modules/asynckit": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
@@ -2080,6 +2090,14 @@
"resolved": "https://registry.npmjs.org/bcryptjs/-/bcryptjs-2.4.3.tgz",
"integrity": "sha512-V/Hy/X9Vt7f3BbPJEi8BdVFMByHi+jNXrYkW3huaybV/kQ0KJg0Y6PkEMbn+zeT+i+SiKZ/HMqJGIIt4LZDqNQ=="
},
"node_modules/bignumber.js": {
"version": "9.1.1",
"resolved": "https://registry.npmjs.org/bignumber.js/-/bignumber.js-9.1.1.tgz",
"integrity": "sha512-pHm4LsMJ6lzgNGVfZHjMoO8sdoRhOzOH4MLmY65Jg70bpxCKu5iOHNJyfF6OyvYw7t8Fpf35RuzUyqnQsj8Vig==",
"engines": {
"node": "*"
}
},
"node_modules/binary-extensions": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/binary-extensions/-/binary-extensions-2.2.0.tgz",
@@ -3016,6 +3034,11 @@
"resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
"integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A=="
},
"node_modules/extend": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/extend/-/extend-3.0.2.tgz",
"integrity": "sha512-fjquC59cD7CyW6urNXK0FBufkZcoiGG80wTuPujX590cB5Ttln20E2UB4S/WARVqhXffZl2LNgS+gQdPIIim/g=="
},
"node_modules/external-editor": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/external-editor/-/external-editor-3.1.0.tgz",
@@ -3108,6 +3131,11 @@
"node": ">=6"
}
},
"node_modules/fast-text-encoding": {
"version": "1.0.6",
"resolved": "https://registry.npmjs.org/fast-text-encoding/-/fast-text-encoding-1.0.6.tgz",
"integrity": "sha512-VhXlQgj9ioXCqGstD37E/HBeqEGV/qOD/kmbVG8h5xKBYvM1L3lR1Zn4555cQ8GkYbJa8aJSipLPndE1k6zK2w=="
},
"node_modules/fast-uri": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/fast-uri/-/fast-uri-2.2.0.tgz",
@@ -3444,6 +3472,32 @@
"node": ">=8"
}
},
"node_modules/gaxios": {
"version": "5.1.0",
"resolved": "https://registry.npmjs.org/gaxios/-/gaxios-5.1.0.tgz",
"integrity": "sha512-aezGIjb+/VfsJtIcHGcBSerNEDdfdHeMros+RbYbGpmonKWQCOVOes0LVZhn1lDtIgq55qq0HaxymIoae3Fl/A==",
"dependencies": {
"extend": "^3.0.2",
"https-proxy-agent": "^5.0.0",
"is-stream": "^2.0.0",
"node-fetch": "^2.6.7"
},
"engines": {
"node": ">=12"
}
},
"node_modules/gcp-metadata": {
"version": "5.2.0",
"resolved": "https://registry.npmjs.org/gcp-metadata/-/gcp-metadata-5.2.0.tgz",
"integrity": "sha512-aFhhvvNycky2QyhG+dcfEdHBF0FRbYcf39s6WNHUDysKSrbJ5vuFbjydxBcmewtXeV248GP8dWT3ByPNxsyHCw==",
"dependencies": {
"gaxios": "^5.0.0",
"json-bigint": "^1.0.0"
},
"engines": {
"node": ">=12"
}
},
"node_modules/get-intrinsic": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.2.0.tgz",
@@ -3528,6 +3582,94 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/google-auth-library": {
"version": "8.8.0",
"resolved": "https://registry.npmjs.org/google-auth-library/-/google-auth-library-8.8.0.tgz",
"integrity": "sha512-0iJn7IDqObDG5Tu9Tn2WemmJ31ksEa96IyK0J0OZCpTh6CrC6FrattwKX87h3qKVuprCJpdOGKc1Xi8V0kMh8Q==",
"dependencies": {
"arrify": "^2.0.0",
"base64-js": "^1.3.0",
"ecdsa-sig-formatter": "^1.0.11",
"fast-text-encoding": "^1.0.0",
"gaxios": "^5.0.0",
"gcp-metadata": "^5.2.0",
"gtoken": "^6.1.0",
"jws": "^4.0.0",
"lru-cache": "^6.0.0"
},
"engines": {
"node": ">=12"
}
},
"node_modules/google-auth-library/node_modules/jwa": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/jwa/-/jwa-2.0.0.tgz",
"integrity": "sha512-jrZ2Qx916EA+fq9cEAeCROWPTfCwi1IVHqT2tapuqLEVVDKFDENFw1oL+MwrTvH6msKxsd1YTDVw6uKEcsrLEA==",
"dependencies": {
"buffer-equal-constant-time": "1.0.1",
"ecdsa-sig-formatter": "1.0.11",
"safe-buffer": "^5.0.1"
}
},
"node_modules/google-auth-library/node_modules/jws": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.0.tgz",
"integrity": "sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg==",
"dependencies": {
"jwa": "^2.0.0",
"safe-buffer": "^5.0.1"
}
},
"node_modules/google-p12-pem": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/google-p12-pem/-/google-p12-pem-4.0.1.tgz",
"integrity": "sha512-WPkN4yGtz05WZ5EhtlxNDWPhC4JIic6G8ePitwUWy4l+XPVYec+a0j0Ts47PDtW59y3RwAhUd9/h9ZZ63px6RQ==",
"dependencies": {
"node-forge": "^1.3.1"
},
"bin": {
"gp12-pem": "build/src/bin/gp12-pem.js"
},
"engines": {
"node": ">=12.0.0"
}
},
"node_modules/googleapis": {
"version": "118.0.0",
"resolved": "https://registry.npmjs.org/googleapis/-/googleapis-118.0.0.tgz",
"integrity": "sha512-Ny6zJOGn5P/YDT6GQbJU6K0lSzEu4Yuxnsn45ZgBIeSQ1RM0FolEjUToLXquZd89DU9wUfqA5XYHPEctk1TFWg==",
"dependencies": {
"google-auth-library": "^8.0.2",
"googleapis-common": "^6.0.0"
},
"engines": {
"node": ">=12.0.0"
}
},
"node_modules/googleapis-common": {
"version": "6.0.4",
"resolved": "https://registry.npmjs.org/googleapis-common/-/googleapis-common-6.0.4.tgz",
"integrity": "sha512-m4ErxGE8unR1z0VajT6AYk3s6a9gIMM6EkDZfkPnES8joeOlEtFEJeF8IyZkb0tjPXkktUfYrE4b3Li1DNyOwA==",
"dependencies": {
"extend": "^3.0.2",
"gaxios": "^5.0.1",
"google-auth-library": "^8.0.2",
"qs": "^6.7.0",
"url-template": "^2.0.8",
"uuid": "^9.0.0"
},
"engines": {
"node": ">=12.0.0"
}
},
"node_modules/googleapis-common/node_modules/uuid": {
"version": "9.0.0",
"resolved": "https://registry.npmjs.org/uuid/-/uuid-9.0.0.tgz",
"integrity": "sha512-MXcSTerfPa4uqyzStbRoTgt5XIe3x5+42+q1sDuy3R5MDk66URdLMOZe5aPX/SQd+kuYAh0FdP/pO28IkQyTeg==",
"bin": {
"uuid": "dist/bin/uuid"
}
},
"node_modules/graceful-fs": {
"version": "4.2.11",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.11.tgz",
@@ -3538,6 +3680,38 @@
"resolved": "https://registry.npmjs.org/grapheme-splitter/-/grapheme-splitter-1.0.4.tgz",
"integrity": "sha512-bzh50DW9kTPM00T8y4o8vQg89Di9oLJVLW/KaOGIXJWP/iqCN6WKYkbNOF04vFLJhwcpYUh9ydh/+5vpOqV4YQ=="
},
"node_modules/gtoken": {
"version": "6.1.2",
"resolved": "https://registry.npmjs.org/gtoken/-/gtoken-6.1.2.tgz",
"integrity": "sha512-4ccGpzz7YAr7lxrT2neugmXQ3hP9ho2gcaityLVkiUecAiwiy60Ii8gRbZeOsXV19fYaRjgBSshs8kXw+NKCPQ==",
"dependencies": {
"gaxios": "^5.0.1",
"google-p12-pem": "^4.0.0",
"jws": "^4.0.0"
},
"engines": {
"node": ">=12.0.0"
}
},
"node_modules/gtoken/node_modules/jwa": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/jwa/-/jwa-2.0.0.tgz",
"integrity": "sha512-jrZ2Qx916EA+fq9cEAeCROWPTfCwi1IVHqT2tapuqLEVVDKFDENFw1oL+MwrTvH6msKxsd1YTDVw6uKEcsrLEA==",
"dependencies": {
"buffer-equal-constant-time": "1.0.1",
"ecdsa-sig-formatter": "1.0.11",
"safe-buffer": "^5.0.1"
}
},
"node_modules/gtoken/node_modules/jws": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.0.tgz",
"integrity": "sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg==",
"dependencies": {
"jwa": "^2.0.0",
"safe-buffer": "^5.0.1"
}
},
"node_modules/handlebars": {
"version": "4.7.7",
"resolved": "https://registry.npmjs.org/handlebars/-/handlebars-4.7.7.tgz",
@@ -4024,6 +4198,14 @@
"js-yaml": "bin/js-yaml.js"
}
},
"node_modules/json-bigint": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/json-bigint/-/json-bigint-1.0.0.tgz",
"integrity": "sha512-SiPv/8VpZuWbvLSMtTDU8hEfrZWg/mH/nV/b4o0CYbSxu1UIQPLdwKOCIyLQX+VIPO5vrLX3i8qtqFyhdPSUSQ==",
"dependencies": {
"bignumber.js": "^9.0.0"
}
},
"node_modules/json-buffer": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.1.tgz",
@@ -4485,6 +4667,14 @@
"webidl-conversions": "^3.0.0"
}
},
"node_modules/node-forge": {
"version": "1.3.1",
"resolved": "https://registry.npmjs.org/node-forge/-/node-forge-1.3.1.tgz",
"integrity": "sha512-dPEtOeMvF9VMcYV/1Wb8CPoVAXtp6MKMlcbAt4ddqmGqUJ6fQZFXkNZNkNlfevtNkGtaSoXf/vNNNSvgrdXwtA==",
"engines": {
"node": ">= 6.13.0"
}
},
"node_modules/nodemailer": {
"version": "6.9.1",
"resolved": "https://registry.npmjs.org/nodemailer/-/nodemailer-6.9.1.tgz",
@@ -4989,9 +5179,9 @@
}
},
"node_modules/pino": {
"version": "8.11.0",
"resolved": "https://registry.npmjs.org/pino/-/pino-8.11.0.tgz",
"integrity": "sha512-Z2eKSvlrl2rH8p5eveNUnTdd4AjJk8tAsLkHYZQKGHP4WTh2Gi1cOSOs3eWPqaj+niS3gj4UkoreoaWgF3ZWYg==",
"version": "8.12.1",
"resolved": "https://registry.npmjs.org/pino/-/pino-8.12.1.tgz",
"integrity": "sha512-n4F7nrEUDH0u5pjlUPkrvsN6oZOhScZWwpWPniwVkMnO6a2DhpnMNWDUlW0SawVICgrNhOf0yTNgw2lYQBsPYA==",
"dependencies": {
"atomic-sleep": "^1.0.0",
"fast-redact": "^3.1.1",
@@ -5902,6 +6092,11 @@
"punycode": "^2.1.0"
}
},
"node_modules/url-template": {
"version": "2.0.8",
"resolved": "https://registry.npmjs.org/url-template/-/url-template-2.0.8.tgz",
"integrity": "sha512-XdVKMF4SJ0nP/O7XIPB0JwAEuT9lDIYnNsK8yGVe43y0AWoKeJNdv3ZNWh7ksJ6KqQFjOO6ox/VEitLnaVNufw=="
},
"node_modules/util": {
"version": "0.10.4",
"resolved": "https://registry.npmjs.org/util/-/util-0.10.4.tgz",
@@ -7900,6 +8095,11 @@
"resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-1.1.1.tgz",
"integrity": "sha512-PCVAQswWemu6UdxsDFFX/+gVeYqKAod3D3UVm91jHwynguOwAvYPhx8nNlM++NqRcK6CxxpUafjmhIdKiHibqg=="
},
"arrify": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/arrify/-/arrify-2.0.1.tgz",
"integrity": "sha512-3duEwti880xqi4eAMN8AyR4a0ByT90zoYdLlevfrvU43vb0YZwZVfxOgxWrLXXXpyugL0hNZc9G6BiB5B3nUug=="
},
"asynckit": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
@@ -7959,6 +8159,11 @@
"resolved": "https://registry.npmjs.org/bcryptjs/-/bcryptjs-2.4.3.tgz",
"integrity": "sha512-V/Hy/X9Vt7f3BbPJEi8BdVFMByHi+jNXrYkW3huaybV/kQ0KJg0Y6PkEMbn+zeT+i+SiKZ/HMqJGIIt4LZDqNQ=="
},
"bignumber.js": {
"version": "9.1.1",
"resolved": "https://registry.npmjs.org/bignumber.js/-/bignumber.js-9.1.1.tgz",
"integrity": "sha512-pHm4LsMJ6lzgNGVfZHjMoO8sdoRhOzOH4MLmY65Jg70bpxCKu5iOHNJyfF6OyvYw7t8Fpf35RuzUyqnQsj8Vig=="
},
"binary-extensions": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/binary-extensions/-/binary-extensions-2.2.0.tgz",
@@ -8643,6 +8848,11 @@
}
}
},
"extend": {
"version": "3.0.2",
"resolved": "https://registry.npmjs.org/extend/-/extend-3.0.2.tgz",
"integrity": "sha512-fjquC59cD7CyW6urNXK0FBufkZcoiGG80wTuPujX590cB5Ttln20E2UB4S/WARVqhXffZl2LNgS+gQdPIIim/g=="
},
"external-editor": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/external-editor/-/external-editor-3.1.0.tgz",
@@ -8727,6 +8937,11 @@
"resolved": "https://registry.npmjs.org/fast-redact/-/fast-redact-3.1.2.tgz",
"integrity": "sha512-+0em+Iya9fKGfEQGcd62Yv6onjBmmhV1uh86XVfOU8VwAe6kaFdQCWI9s0/Nnugx5Vd9tdbZ7e6gE2tR9dzXdw=="
},
"fast-text-encoding": {
"version": "1.0.6",
"resolved": "https://registry.npmjs.org/fast-text-encoding/-/fast-text-encoding-1.0.6.tgz",
"integrity": "sha512-VhXlQgj9ioXCqGstD37E/HBeqEGV/qOD/kmbVG8h5xKBYvM1L3lR1Zn4555cQ8GkYbJa8aJSipLPndE1k6zK2w=="
},
"fast-uri": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/fast-uri/-/fast-uri-2.2.0.tgz",
@@ -8982,6 +9197,26 @@
}
}
},
"gaxios": {
"version": "5.1.0",
"resolved": "https://registry.npmjs.org/gaxios/-/gaxios-5.1.0.tgz",
"integrity": "sha512-aezGIjb+/VfsJtIcHGcBSerNEDdfdHeMros+RbYbGpmonKWQCOVOes0LVZhn1lDtIgq55qq0HaxymIoae3Fl/A==",
"requires": {
"extend": "^3.0.2",
"https-proxy-agent": "^5.0.0",
"is-stream": "^2.0.0",
"node-fetch": "^2.6.7"
}
},
"gcp-metadata": {
"version": "5.2.0",
"resolved": "https://registry.npmjs.org/gcp-metadata/-/gcp-metadata-5.2.0.tgz",
"integrity": "sha512-aFhhvvNycky2QyhG+dcfEdHBF0FRbYcf39s6WNHUDysKSrbJ5vuFbjydxBcmewtXeV248GP8dWT3ByPNxsyHCw==",
"requires": {
"gaxios": "^5.0.0",
"json-bigint": "^1.0.0"
}
},
"get-intrinsic": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.2.0.tgz",
@@ -9038,6 +9273,80 @@
}
}
},
"google-auth-library": {
"version": "8.8.0",
"resolved": "https://registry.npmjs.org/google-auth-library/-/google-auth-library-8.8.0.tgz",
"integrity": "sha512-0iJn7IDqObDG5Tu9Tn2WemmJ31ksEa96IyK0J0OZCpTh6CrC6FrattwKX87h3qKVuprCJpdOGKc1Xi8V0kMh8Q==",
"requires": {
"arrify": "^2.0.0",
"base64-js": "^1.3.0",
"ecdsa-sig-formatter": "^1.0.11",
"fast-text-encoding": "^1.0.0",
"gaxios": "^5.0.0",
"gcp-metadata": "^5.2.0",
"gtoken": "^6.1.0",
"jws": "^4.0.0",
"lru-cache": "^6.0.0"
},
"dependencies": {
"jwa": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/jwa/-/jwa-2.0.0.tgz",
"integrity": "sha512-jrZ2Qx916EA+fq9cEAeCROWPTfCwi1IVHqT2tapuqLEVVDKFDENFw1oL+MwrTvH6msKxsd1YTDVw6uKEcsrLEA==",
"requires": {
"buffer-equal-constant-time": "1.0.1",
"ecdsa-sig-formatter": "1.0.11",
"safe-buffer": "^5.0.1"
}
},
"jws": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.0.tgz",
"integrity": "sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg==",
"requires": {
"jwa": "^2.0.0",
"safe-buffer": "^5.0.1"
}
}
}
},
"google-p12-pem": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/google-p12-pem/-/google-p12-pem-4.0.1.tgz",
"integrity": "sha512-WPkN4yGtz05WZ5EhtlxNDWPhC4JIic6G8ePitwUWy4l+XPVYec+a0j0Ts47PDtW59y3RwAhUd9/h9ZZ63px6RQ==",
"requires": {
"node-forge": "^1.3.1"
}
},
"googleapis": {
"version": "118.0.0",
"resolved": "https://registry.npmjs.org/googleapis/-/googleapis-118.0.0.tgz",
"integrity": "sha512-Ny6zJOGn5P/YDT6GQbJU6K0lSzEu4Yuxnsn45ZgBIeSQ1RM0FolEjUToLXquZd89DU9wUfqA5XYHPEctk1TFWg==",
"requires": {
"google-auth-library": "^8.0.2",
"googleapis-common": "^6.0.0"
}
},
"googleapis-common": {
"version": "6.0.4",
"resolved": "https://registry.npmjs.org/googleapis-common/-/googleapis-common-6.0.4.tgz",
"integrity": "sha512-m4ErxGE8unR1z0VajT6AYk3s6a9gIMM6EkDZfkPnES8joeOlEtFEJeF8IyZkb0tjPXkktUfYrE4b3Li1DNyOwA==",
"requires": {
"extend": "^3.0.2",
"gaxios": "^5.0.1",
"google-auth-library": "^8.0.2",
"qs": "^6.7.0",
"url-template": "^2.0.8",
"uuid": "^9.0.0"
},
"dependencies": {
"uuid": {
"version": "9.0.0",
"resolved": "https://registry.npmjs.org/uuid/-/uuid-9.0.0.tgz",
"integrity": "sha512-MXcSTerfPa4uqyzStbRoTgt5XIe3x5+42+q1sDuy3R5MDk66URdLMOZe5aPX/SQd+kuYAh0FdP/pO28IkQyTeg=="
}
}
},
"graceful-fs": {
"version": "4.2.11",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.11.tgz",
@@ -9048,6 +9357,37 @@
"resolved": "https://registry.npmjs.org/grapheme-splitter/-/grapheme-splitter-1.0.4.tgz",
"integrity": "sha512-bzh50DW9kTPM00T8y4o8vQg89Di9oLJVLW/KaOGIXJWP/iqCN6WKYkbNOF04vFLJhwcpYUh9ydh/+5vpOqV4YQ=="
},
"gtoken": {
"version": "6.1.2",
"resolved": "https://registry.npmjs.org/gtoken/-/gtoken-6.1.2.tgz",
"integrity": "sha512-4ccGpzz7YAr7lxrT2neugmXQ3hP9ho2gcaityLVkiUecAiwiy60Ii8gRbZeOsXV19fYaRjgBSshs8kXw+NKCPQ==",
"requires": {
"gaxios": "^5.0.1",
"google-p12-pem": "^4.0.0",
"jws": "^4.0.0"
},
"dependencies": {
"jwa": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/jwa/-/jwa-2.0.0.tgz",
"integrity": "sha512-jrZ2Qx916EA+fq9cEAeCROWPTfCwi1IVHqT2tapuqLEVVDKFDENFw1oL+MwrTvH6msKxsd1YTDVw6uKEcsrLEA==",
"requires": {
"buffer-equal-constant-time": "1.0.1",
"ecdsa-sig-formatter": "1.0.11",
"safe-buffer": "^5.0.1"
}
},
"jws": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/jws/-/jws-4.0.0.tgz",
"integrity": "sha512-KDncfTmOZoOMTFG4mBlG0qUIOlc03fmzH+ru6RgYVZhPkyiy/92Owlt/8UEN+a4TXR1FQetfIpJE8ApdvdVxTg==",
"requires": {
"jwa": "^2.0.0",
"safe-buffer": "^5.0.1"
}
}
}
},
"handlebars": {
"version": "4.7.7",
"resolved": "https://registry.npmjs.org/handlebars/-/handlebars-4.7.7.tgz",
@@ -9386,6 +9726,14 @@
"argparse": "^2.0.1"
}
},
"json-bigint": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/json-bigint/-/json-bigint-1.0.0.tgz",
"integrity": "sha512-SiPv/8VpZuWbvLSMtTDU8hEfrZWg/mH/nV/b4o0CYbSxu1UIQPLdwKOCIyLQX+VIPO5vrLX3i8qtqFyhdPSUSQ==",
"requires": {
"bignumber.js": "^9.0.0"
}
},
"json-buffer": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.1.tgz",
@@ -9746,6 +10094,11 @@
}
}
},
"node-forge": {
"version": "1.3.1",
"resolved": "https://registry.npmjs.org/node-forge/-/node-forge-1.3.1.tgz",
"integrity": "sha512-dPEtOeMvF9VMcYV/1Wb8CPoVAXtp6MKMlcbAt4ddqmGqUJ6fQZFXkNZNkNlfevtNkGtaSoXf/vNNNSvgrdXwtA=="
},
"nodemailer": {
"version": "6.9.1",
"resolved": "https://registry.npmjs.org/nodemailer/-/nodemailer-6.9.1.tgz",
@@ -10116,9 +10469,9 @@
"integrity": "sha512-eW/gHNMlxdSP6dmG6uJip6FXN0EQBwm2clYYd8Wul42Cwu/DK8HEftzsapcNdYe2MfLiIwZqsDk2RDEsTE79hA=="
},
"pino": {
"version": "8.11.0",
"resolved": "https://registry.npmjs.org/pino/-/pino-8.11.0.tgz",
"integrity": "sha512-Z2eKSvlrl2rH8p5eveNUnTdd4AjJk8tAsLkHYZQKGHP4WTh2Gi1cOSOs3eWPqaj+niS3gj4UkoreoaWgF3ZWYg==",
"version": "8.12.1",
"resolved": "https://registry.npmjs.org/pino/-/pino-8.12.1.tgz",
"integrity": "sha512-n4F7nrEUDH0u5pjlUPkrvsN6oZOhScZWwpWPniwVkMnO6a2DhpnMNWDUlW0SawVICgrNhOf0yTNgw2lYQBsPYA==",
"requires": {
"atomic-sleep": "^1.0.0",
"fast-redact": "^3.1.1",
@@ -10801,6 +11154,11 @@
"punycode": "^2.1.0"
}
},
"url-template": {
"version": "2.0.8",
"resolved": "https://registry.npmjs.org/url-template/-/url-template-2.0.8.tgz",
"integrity": "sha512-XdVKMF4SJ0nP/O7XIPB0JwAEuT9lDIYnNsK8yGVe43y0AWoKeJNdv3ZNWh7ksJ6KqQFjOO6ox/VEitLnaVNufw=="
},
"util": {
"version": "0.10.4",
"resolved": "https://registry.npmjs.org/util/-/util-0.10.4.tgz",

View File

@@ -1,6 +1,6 @@
{
"name": "chatgpt-clone",
"version": "0.4.1",
"name": "chat-backend",
"version": "0.4.4",
"description": "",
"main": "server/index.js",
"scripts": {
@@ -32,6 +32,7 @@
"dotenv": "^16.0.3",
"eslint": "^8.36.0",
"express": "^4.18.2",
"googleapis": "^118.0.0",
"handlebars": "^4.7.7",
"html": "^1.0.0",
"joi": "^14.3.1",
@@ -49,6 +50,7 @@
"passport-google-oauth20": "^2.0.0",
"passport-jwt": "^4.0.1",
"passport-local": "^1.0.0",
"pino": "^8.12.1",
"sanitize": "^2.1.2"
},
"devDependencies": {

View File

@@ -0,0 +1,156 @@
const express = require('express');
const router = express.Router();
const { titleConvo } = require('../../../app/');
const GoogleClient = require('../../../app/google/GoogleClient');
const { saveMessage, getConvoTitle, saveConvo, getConvo } = require('../../../models');
const { handleError, sendMessage, createOnProgress } = require('./handlers');
const requireJwtAuth = require('../../../middleware/requireJwtAuth');
router.post('/', requireJwtAuth, async (req, res) => {
const { endpoint, text, parentMessageId, conversationId } = req.body;
if (text.length === 0) return handleError(res, { text: 'Prompt empty or too short' });
if (endpoint !== 'google') return handleError(res, { text: 'Illegal request' });
// build endpoint option
const endpointOption = {
examples: req.body?.examples ?? [{ input: { content: '' }, output: { content: '' } }],
promptPrefix: req.body?.promptPrefix ?? null,
token: req.body?.token ?? null,
modelOptions: {
model: req.body?.model ?? 'chat-bison',
modelLabel: req.body?.modelLabel ?? null,
temperature: req.body?.temperature ?? 0.2,
maxOutputTokens: req.body?.maxOutputTokens ?? 1024,
topP: req.body?.topP ?? 0.95,
topK: req.body?.topK ?? 40
}
};
const availableModels = ['chat-bison', 'text-bison'];
if (availableModels.find(model => model === endpointOption.modelOptions.model) === undefined) {
return handleError(res, { text: `Illegal request: model` });
}
// eslint-disable-next-line no-use-before-define
return await ask({
text,
endpointOption,
conversationId,
parentMessageId,
req,
res
});
});
const ask = async ({ text, endpointOption, parentMessageId = null, conversationId, req, res }) => {
res.writeHead(200, {
Connection: 'keep-alive',
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache, no-transform',
'Access-Control-Allow-Origin': '*',
'X-Accel-Buffering': 'no'
});
let userMessage;
let userMessageId;
let responseMessageId;
let lastSavedTimestamp = 0;
try {
const getIds = (data) => {
userMessage = data.userMessage;
userMessageId = userMessage.messageId;
responseMessageId = data.responseMessageId;
if (!conversationId) {
conversationId = data.conversationId;
}
};
const { onProgress: progressCallback } = createOnProgress({
onProgress: ({ text: partialText }) => {
const currentTimestamp = Date.now();
if (currentTimestamp - lastSavedTimestamp > 500) {
lastSavedTimestamp = currentTimestamp;
saveMessage({
messageId: responseMessageId,
sender: 'PaLM2',
conversationId,
parentMessageId: userMessageId,
text: partialText,
unfinished: true,
cancelled: false,
error: false
});
}
}
});
const abortController = new AbortController();
let key;
if (endpointOption.token) {
key = JSON.parse(endpointOption.token);
delete endpointOption.token;
console.log('Using service account key provided by User for PaLM models');
}
try {
if (!key) {
key = require('../../../data/auth.json');
}
} catch (e) {
console.log("No 'auth.json' file (service account key) found in /api/data/ for PaLM models");
}
const clientOptions = {
// debug: true, // for testing
reverseProxyUrl: process.env.GOOGLE_REVERSE_PROXY || null,
proxy: process.env.PROXY || null,
...endpointOption
};
const client = new GoogleClient(key, clientOptions);
let response = await client.sendMessage(text, {
getIds,
user: req.user.id,
parentMessageId,
conversationId,
onProgress: progressCallback.call(null, { res, text, parentMessageId: userMessageId }),
abortController
});
await saveMessage(response);
sendMessage(res, {
title: await getConvoTitle(req.user.id, conversationId),
final: true,
conversation: await getConvo(req.user.id, conversationId),
requestMessage: userMessage,
responseMessage: response
});
res.end();
if (parentMessageId == '00000000-0000-0000-0000-000000000000') {
const title = await titleConvo({ text, response });
await saveConvo(req.user.id, {
conversationId: conversationId,
title
});
}
} catch (error) {
console.error(error);
const errorMessage = {
messageId: responseMessageId,
sender: 'PaLM2',
conversationId,
parentMessageId,
unfinished: false,
cancelled: false,
error: true,
text: error.message
};
await saveMessage(errorMessage);
handleError(res, errorMessage);
}
};
module.exports = router;
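
For orientation only, a minimal sketch of how a client might call the route added above. The URL assumes the ask router is mounted at /api/ask (the mount point itself is not part of this diff), and how the JWT reaches requireJwtAuth (header vs. cookie) is also an assumption. The route replies with a server-sent event stream, so the body is read incrementally:

// Hypothetical browser-side call; '/api/ask/google' and the Authorization header are assumptions.
const res = await fetch('/api/ask/google', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${jwt}` // however the app normally attaches its JWT
  },
  body: JSON.stringify({
    endpoint: 'google',
    text: 'Hello PaLM',
    model: 'chat-bison',
    temperature: 0.2,
    topP: 0.95,
    topK: 40,
    maxOutputTokens: 1024,
    parentMessageId: '00000000-0000-0000-0000-000000000000', // triggers title generation per the route above
    conversationId: null
  })
});

// Read the event stream chunk by chunk as it arrives.
const reader = res.body.getReader();
const decoder = new TextDecoder();
let done = false;
while (!done) {
  const chunk = await reader.read();
  done = chunk.done;
  if (chunk.value) console.log(decoder.decode(chunk.value));
}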

View File

@@ -2,11 +2,13 @@ const express = require('express');
const router = express.Router();
// const askAzureOpenAI = require('./askAzureOpenAI';)
const askOpenAI = require('./askOpenAI');
const askGoogle = require('./askGoogle');
const askBingAI = require('./askBingAI');
const askChatGPTBrowser = require('./askChatGPTBrowser');
// router.use('/azureOpenAI', askAzureOpenAI);
router.use('/openAI', askOpenAI);
router.use('/google', askGoogle);
router.use('/bingAI', askBingAI);
router.use('/chatGPTBrowser', askChatGPTBrowser);

View File

@@ -15,9 +15,33 @@ const getChatGPTBrowserModels = () => {
return models;
};
router.get('/', function (req, res) {
let i = 0;
router.get('/', async function (req, res) {
let key, palmUser;
try {
key = require('../../data/auth.json');
} catch (e) {
if (i === 0) {
console.log("No 'auth.json' file (service account key) found in /api/data/ for PaLM models");
i++;
}
}
if (process.env.PALM_KEY === 'user_provided') {
palmUser = true;
if (i <= 1) {
console.log('User will provide key for PaLM models');
i++;
}
}
const google =
key || palmUser ? { userProvide: palmUser, availableModels: ['chat-bison', 'text-bison'] } : false;
const azureOpenAI = !!process.env.AZURE_OPENAI_KEY;
const openAI = process.env.OPENAI_KEY ? { availableModels: getOpenAIModels() } : false;
const openAI =
process.env.OPENAI_KEY || process.env.AZURE_OPENAI_API_KEY
? { availableModels: getOpenAIModels() }
: false;
const bingAI = process.env.BINGAI_TOKEN
? { userProvide: process.env.BINGAI_TOKEN == 'user_provided' }
: false;
@@ -28,7 +52,7 @@ router.get('/', function (req, res) {
}
: false;
res.send(JSON.stringify({ azureOpenAI, openAI, bingAI, chatGPTBrowser }));
res.send(JSON.stringify({ azureOpenAI, openAI, google, bingAI, chatGPTBrowser }));
});
module.exports = { router, getOpenAIModels, getChatGPTBrowserModels };
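
As a rough illustration of what the updated endpoints route now returns (actual values depend on which environment variables and key files are present, and the openAI model list is whatever getOpenAIModels() yields), the JSON body looks roughly like:

{
  "azureOpenAI": false,
  "openAI": { "availableModels": ["gpt-3.5-turbo"] },
  "google": { "userProvide": true, "availableModels": ["chat-bison", "text-bison"] },
  "bingAI": { "userProvide": true },
  "chatGPTBrowser": false
}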

api/utils/LoggingSystem.js (new file, 125 lines)
View File

@@ -0,0 +1,125 @@
const pino = require('pino');
const logger = pino({
level: 'info',
redact: {
paths: [ // List of Paths to redact from the logs (https://getpino.io/#/docs/redaction)
'env.OPENAI_KEY',
'env.BINGAI_TOKEN',
'env.CHATGPT_TOKEN',
'env.MEILI_MASTER_KEY',
'env.GOOGLE_CLIENT_SECRET',
'env.JWT_SECRET_DEV',
'env.JWT_SECRET_PROD',
'newUser.password'], // See example to filter object class instances
censor: '***', // Redaction character
},
});
// Sanitize outside the logger paths. This is useful for sanitizing variables directly with Regex and patterns.
const redactPatterns = [ // Array of regular expressions for redacting patterns
/api[-_]?key/i,
/password/i,
/token/i,
/secret/i,
/key/i,
/certificate/i,
/client[-_]?id/i,
/authorization[-_]?code/i,
/authorization[-_]?login[-_]?hint/i,
/authorization[-_]?acr[-_]?values/i,
/authorization[-_]?response[-_]?mode/i,
/authorization[-_]?nonce/i
];
/*
// Example of redacting sensitive data from object class instances
function redactSensitiveData(obj) {
if (obj instanceof User) {
return {
...obj.toObject(),
password: '***', // Redact the password field
};
}
return obj;
}
// Example of redacting sensitive data from object class instances
logger.info({ newUser: redactSensitiveData(newUser) }, 'newUser');
*/
const levels = {
TRACE: 10,
DEBUG: 20,
INFO: 30,
WARN: 40,
ERROR: 50,
FATAL: 60
};
let level = levels.INFO;
module.exports = {
levels,
setLevel: (l) => (level = l),
log: {
trace: (msg) => {
if (level <= levels.TRACE) return;
logger.trace(msg);
},
debug: (msg) => {
if (level <= levels.DEBUG) return;
logger.debug(msg);
},
info: (msg) => {
if (level <= levels.INFO) return;
logger.info(msg);
},
warn: (msg) => {
if (level <= levels.WARN) return;
logger.warn(msg);
},
error: (msg) => {
if (level <= levels.ERROR) return;
logger.error(msg);
},
fatal: (msg) => {
if (level <= levels.FATAL) return;
logger.fatal(msg);
},
// Custom loggers
parameters: (parameters) => {
if (level <= levels.TRACE) return;
logger.debug({ parameters }, 'Function Parameters');
},
functionName: (name) => {
if (level <= levels.TRACE) return;
logger.debug(`EXECUTING: ${name}`);
},
flow: (flow) => {
if (level <= levels.INFO) return;
logger.debug(`BEGIN FLOW: ${flow}`);
},
variable: ({ name, value }) => {
if (level <= levels.DEBUG) return;
// Check if the variable name matches any of the redact patterns and redact the value
let sanitizedValue = value;
for (const pattern of redactPatterns) {
if (pattern.test(name)) {
sanitizedValue = '***';
break;
}
}
logger.debug({ variable: { name, value: sanitizedValue } }, `VARIABLE ${name}`);
},
request: () => (req, res, next) => {
if (level < levels.DEBUG) return next();
logger.debug({ query: req.query, body: req.body }, `Hit URL ${req.url} with following`);
return next();
}
}
};
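
For reference, a small consumption sketch for the logger wrapper above. The require path is an assumption (the file lives at api/utils/LoggingSystem.js, so the relative path depends on the caller), and whether each call actually emits anything is governed by the level guards defined in the wrapper.

// Hypothetical consumer; the relative path is assumed.
const { log, levels, setLevel } = require('../utils/LoggingSystem');

setLevel(levels.DEBUG); // adjust verbosity at runtime

log.info('server started');
// 'apiKey' matches the /api[-_]?key/i redaction pattern, so the value is censored to '***' whenever this emits.
log.variable({ name: 'apiKey', value: 'sk-123' });

// As Express middleware, logging query and body per request:
// app.use(log.request());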

View File

@@ -0,0 +1,5 @@
function genAzureEndpoint({ azureOpenAIApiInstanceName, azureOpenAIApiDeploymentName, azureOpenAIApiVersion }) {
return `https://${azureOpenAIApiInstanceName}.openai.azure.com/openai/deployments/${azureOpenAIApiDeploymentName}/chat/completions?api-version=${azureOpenAIApiVersion}`;
}
module.exports = { genAzureEndpoint };
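
A quick usage sketch for the helper above, with purely illustrative values (instance, deployment, and API version are placeholders, and the require path is assumed):

const { genAzureEndpoint } = require('./genAzureEndpoint');
const url = genAzureEndpoint({
  azureOpenAIApiInstanceName: 'my-instance',
  azureOpenAIApiDeploymentName: 'gpt35',
  azureOpenAIApiVersion: '2023-03-15-preview'
});
// url === 'https://my-instance.openai.azure.com/openai/deployments/gpt35/chat/completions?api-version=2023-03-15-preview'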

client/package-lock.json (generated, 350 changed lines)
View File

@@ -1,12 +1,12 @@
{
"name": "chatgpt-clone",
"version": "0.4.1",
"version": "0.4.4",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "chatgpt-clone",
"version": "0.4.1",
"version": "0.4.4",
"license": "ISC",
"dependencies": {
"@fortawesome/fontawesome-svg-core": "^6.4.0",
@@ -42,6 +42,7 @@
"html2canvas": "^1.4.1",
"lodash": "^4.17.21",
"lucide-react": "^0.113.0",
"pino": "^8.12.1",
"rc-input-number": "^7.4.2",
"react": "^18.2.0",
"react-dom": "^18.2.0",
@@ -3941,6 +3942,17 @@
"dev": true,
"license": "BSD-3-Clause"
},
"node_modules/abort-controller": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/abort-controller/-/abort-controller-3.0.0.tgz",
"integrity": "sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg==",
"dependencies": {
"event-target-shim": "^5.0.0"
},
"engines": {
"node": ">=6.5"
}
},
"node_modules/acorn": {
"version": "8.8.2",
"dev": true,
@@ -4191,6 +4203,14 @@
"version": "0.4.0",
"license": "MIT"
},
"node_modules/atomic-sleep": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/atomic-sleep/-/atomic-sleep-1.0.0.tgz",
"integrity": "sha512-kNOjDqAh7px0XWNI+4QbzoiR/nTkHAWNud2uvnJquD1/x5a7EQZMJT0AczqK0Qn67oY/TTQ1LbUKajZpp3I9tQ==",
"engines": {
"node": ">=8.0.0"
}
},
"node_modules/autoprefixer": {
"version": "10.4.14",
"dev": true,
@@ -4450,6 +4470,25 @@
"node": ">= 0.6.0"
}
},
"node_modules/base64-js": {
"version": "1.5.1",
"resolved": "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz",
"integrity": "sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
]
},
"node_modules/big.js": {
"version": "5.2.2",
"dev": true,
@@ -4577,6 +4616,29 @@
"node": "^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7"
}
},
"node_modules/buffer": {
"version": "6.0.3",
"resolved": "https://registry.npmjs.org/buffer/-/buffer-6.0.3.tgz",
"integrity": "sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
],
"dependencies": {
"base64-js": "^1.3.1",
"ieee754": "^1.2.1"
}
},
"node_modules/buffer-from": {
"version": "1.1.2",
"dev": true,
@@ -6060,11 +6122,17 @@
"node": ">=0.10.0"
}
},
"node_modules/event-target-shim": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/event-target-shim/-/event-target-shim-5.0.1.tgz",
"integrity": "sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ==",
"engines": {
"node": ">=6"
}
},
"node_modules/events": {
"version": "3.3.0",
"dev": true,
"license": "MIT",
"peer": true,
"engines": {
"node": ">=0.8.x"
}
@@ -6130,6 +6198,14 @@
"dev": true,
"license": "MIT"
},
"node_modules/fast-redact": {
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/fast-redact/-/fast-redact-3.1.2.tgz",
"integrity": "sha512-+0em+Iya9fKGfEQGcd62Yv6onjBmmhV1uh86XVfOU8VwAe6kaFdQCWI9s0/Nnugx5Vd9tdbZ7e6gE2tR9dzXdw==",
"engines": {
"node": ">=6"
}
},
"node_modules/fastq": {
"version": "1.15.0",
"license": "ISC",
@@ -6847,6 +6923,25 @@
"postcss": "^8.1.0"
}
},
"node_modules/ieee754": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/ieee754/-/ieee754-1.2.1.tgz",
"integrity": "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
]
},
"node_modules/ignore": {
"version": "5.2.4",
"dev": true,
@@ -9029,6 +9124,11 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/on-exit-leak-free": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/on-exit-leak-free/-/on-exit-leak-free-2.1.0.tgz",
"integrity": "sha512-VuCaZZAjReZ3vUwgOB8LxAosIurDiAW0s13rI1YwmaP++jvcxP77AWoQvenZebpCA2m8WC1/EosPYPMjnRAp/w=="
},
"node_modules/once": {
"version": "1.4.0",
"license": "ISC",
@@ -9234,6 +9334,55 @@
"node": ">=6"
}
},
"node_modules/pino": {
"version": "8.12.1",
"resolved": "https://registry.npmjs.org/pino/-/pino-8.12.1.tgz",
"integrity": "sha512-n4F7nrEUDH0u5pjlUPkrvsN6oZOhScZWwpWPniwVkMnO6a2DhpnMNWDUlW0SawVICgrNhOf0yTNgw2lYQBsPYA==",
"dependencies": {
"atomic-sleep": "^1.0.0",
"fast-redact": "^3.1.1",
"on-exit-leak-free": "^2.1.0",
"pino-abstract-transport": "v1.0.0",
"pino-std-serializers": "^6.0.0",
"process-warning": "^2.0.0",
"quick-format-unescaped": "^4.0.3",
"real-require": "^0.2.0",
"safe-stable-stringify": "^2.3.1",
"sonic-boom": "^3.1.0",
"thread-stream": "^2.0.0"
},
"bin": {
"pino": "bin.js"
}
},
"node_modules/pino-abstract-transport": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/pino-abstract-transport/-/pino-abstract-transport-1.0.0.tgz",
"integrity": "sha512-c7vo5OpW4wIS42hUVcT5REsL8ZljsUfBjqV/e2sFxmFEFZiq1XLUp5EYLtuDH6PEHq9W1egWqRbnLUP5FuZmOA==",
"dependencies": {
"readable-stream": "^4.0.0",
"split2": "^4.0.0"
}
},
"node_modules/pino-abstract-transport/node_modules/readable-stream": {
"version": "4.3.0",
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-4.3.0.tgz",
"integrity": "sha512-MuEnA0lbSi7JS8XM+WNJlWZkHAAdm7gETHdFK//Q/mChGyj2akEFtdLZh32jSdkWGbRwCW9pn6g3LWDdDeZnBQ==",
"dependencies": {
"abort-controller": "^3.0.0",
"buffer": "^6.0.3",
"events": "^3.3.0",
"process": "^0.11.10"
},
"engines": {
"node": "^12.22.0 || ^14.17.0 || >=16.0.0"
}
},
"node_modules/pino-std-serializers": {
"version": "6.2.1",
"resolved": "https://registry.npmjs.org/pino-std-serializers/-/pino-std-serializers-6.2.1.tgz",
"integrity": "sha512-wHuWB+CvSVb2XqXM0W/WOYUkVSPbiJb9S5fNB7TBhd8s892Xq910bRxwHtC4l71hgztObTjXL6ZheZXFjhDrDQ=="
},
"node_modules/pirates": {
"version": "4.0.5",
"license": "MIT",
@@ -10163,12 +10312,16 @@
},
"node_modules/process": {
"version": "0.11.10",
"dev": true,
"license": "MIT",
"engines": {
"node": ">= 0.6.0"
}
},
"node_modules/process-warning": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/process-warning/-/process-warning-2.2.0.tgz",
"integrity": "sha512-/1WZ8+VQjR6avWOgHeEPd7SDQmFQ1B5mC1eRXsCm5TarlNmx/wCsa5GEaxGm05BORRtyG/Ex/3xq3TuRvq57qg=="
},
"node_modules/prop-types": {
"version": "15.8.1",
"license": "MIT",
@@ -10242,6 +10395,11 @@
],
"license": "MIT"
},
"node_modules/quick-format-unescaped": {
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/quick-format-unescaped/-/quick-format-unescaped-4.0.4.tgz",
"integrity": "sha512-tYC1Q1hgyRuHgloV/YXs2w15unPVh8qfu/qCTfhTYamaw7fyhumKa2yGpdSo87vY32rIclj+4fWYQXUMs9EHvg=="
},
"node_modules/quick-lru": {
"version": "5.1.1",
"license": "MIT",
@@ -10546,6 +10704,14 @@
"node": ">=8.10.0"
}
},
"node_modules/real-require": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/real-require/-/real-require-0.2.0.tgz",
"integrity": "sha512-57frrGM/OCTLqLOAh0mhVA9VBMHd+9U7Zb2THMGdBUoZVOtGbJzjxsYGDJ3A9AYYCP4hn6y1TVbaOfzWtm5GFg==",
"engines": {
"node": ">= 12.13.0"
}
},
"node_modules/recoil": {
"version": "0.7.7",
"license": "MIT",
@@ -10916,6 +11082,14 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/safe-stable-stringify": {
"version": "2.4.3",
"resolved": "https://registry.npmjs.org/safe-stable-stringify/-/safe-stable-stringify-2.4.3.tgz",
"integrity": "sha512-e2bDA2WJT0wxseVd4lsDP4+3ONX6HpMXQa1ZhFQ7SU+GjvORCmShbCMltrtIDfkYhVHrOcPtj+KhmDBdPdZD1g==",
"engines": {
"node": ">=10"
}
},
"node_modules/safer-buffer": {
"version": "2.1.2",
"license": "MIT"
@@ -11044,6 +11218,14 @@
"node": ">=6"
}
},
"node_modules/sonic-boom": {
"version": "3.3.0",
"resolved": "https://registry.npmjs.org/sonic-boom/-/sonic-boom-3.3.0.tgz",
"integrity": "sha512-LYxp34KlZ1a2Jb8ZQgFCK3niIHzibdwtwNUWKg0qQRzsDoJ3Gfgkf8KdBTFU3SkejDEIlWwnSnpVdOZIhFMl/g==",
"dependencies": {
"atomic-sleep": "^1.0.0"
}
},
"node_modules/source-map": {
"version": "0.6.1",
"dev": true,
@@ -11116,6 +11298,14 @@
"url": "https://github.com/sponsors/wooorm"
}
},
"node_modules/split2": {
"version": "4.2.0",
"resolved": "https://registry.npmjs.org/split2/-/split2-4.2.0.tgz",
"integrity": "sha512-UcjcJOWknrNkF6PLX83qcHM6KHgVKNkV62Y8a5uYDVv9ydGQVwAHMKqHdJje1VTWpljG0WYpCDhrCdAOYH4TWg==",
"engines": {
"node": ">= 10.x"
}
},
"node_modules/stack-utils": {
"version": "2.0.6",
"resolved": "https://registry.npmjs.org/stack-utils/-/stack-utils-2.0.6.tgz",
@@ -11525,6 +11715,14 @@
"node": ">=0.8"
}
},
"node_modules/thread-stream": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/thread-stream/-/thread-stream-2.3.0.tgz",
"integrity": "sha512-kaDqm1DET9pp3NXwR8382WHbnpXnRkN9xGN9dQt3B2+dmXiW8X1SOwmFOxAErEQ47ObhZ96J6yhZNXuyCOL7KA==",
"dependencies": {
"real-require": "^0.2.0"
}
},
"node_modules/to-fast-properties": {
"version": "2.0.0",
"dev": true,
@@ -14815,6 +15013,14 @@
"version": "2.0.6",
"dev": true
},
"abort-controller": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/abort-controller/-/abort-controller-3.0.0.tgz",
"integrity": "sha512-h8lQ8tacZYnR3vNQTgibj+tODHI5/+l06Au2Pcriv/Gmet0eaj4TwWH41sO9wnHDiQsEj19q0drzdWdeAHtweg==",
"requires": {
"event-target-shim": "^5.0.0"
}
},
"acorn": {
"version": "8.8.2",
"dev": true
@@ -14982,6 +15188,11 @@
"asynckit": {
"version": "0.4.0"
},
"atomic-sleep": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/atomic-sleep/-/atomic-sleep-1.0.0.tgz",
"integrity": "sha512-kNOjDqAh7px0XWNI+4QbzoiR/nTkHAWNud2uvnJquD1/x5a7EQZMJT0AczqK0Qn67oY/TTQ1LbUKajZpp3I9tQ=="
},
"autoprefixer": {
"version": "10.4.14",
"dev": true,
@@ -15168,6 +15379,11 @@
"resolved": "https://registry.npmjs.org/base64-arraybuffer/-/base64-arraybuffer-1.0.2.tgz",
"integrity": "sha512-I3yl4r9QB5ZRY3XuJVEPfc2XhZO6YweFPI+UovAzn+8/hb3oJ6lnysaFcjVpkCPfVWFUDvoZ8kmVDP7WyRtYtQ=="
},
"base64-js": {
"version": "1.5.1",
"resolved": "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz",
"integrity": "sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA=="
},
"big.js": {
"version": "5.2.2",
"dev": true
@@ -15257,6 +15473,15 @@
"update-browserslist-db": "^1.0.10"
}
},
"buffer": {
"version": "6.0.3",
"resolved": "https://registry.npmjs.org/buffer/-/buffer-6.0.3.tgz",
"integrity": "sha512-FTiCpNxtwiZZHEZbcbTIcZjERVICn9yq/pDFkTl95/AxzD1naBctN7YO68riM/gLSDY7sdrMby8hofADYuuqOA==",
"requires": {
"base64-js": "^1.3.1",
"ieee754": "^1.2.1"
}
},
"buffer-from": {
"version": "1.1.2",
"dev": true
@@ -16220,10 +16445,13 @@
"version": "2.0.3",
"dev": true
},
"event-target-shim": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/event-target-shim/-/event-target-shim-5.0.1.tgz",
"integrity": "sha512-i/2XbnSz/uxRCU6+NdVJgKWDTM427+MqYbkQzD321DuCQJUqOuJKIA0IM2+W2xtYHdKOmZ4dR6fExsd4SXL+WQ=="
},
"events": {
"version": "3.3.0",
"dev": true,
"peer": true
"version": "3.3.0"
},
"evp_bytestokey": {
"version": "1.0.3",
@@ -16273,6 +16501,11 @@
"version": "2.0.6",
"dev": true
},
"fast-redact": {
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/fast-redact/-/fast-redact-3.1.2.tgz",
"integrity": "sha512-+0em+Iya9fKGfEQGcd62Yv6onjBmmhV1uh86XVfOU8VwAe6kaFdQCWI9s0/Nnugx5Vd9tdbZ7e6gE2tR9dzXdw=="
},
"fastq": {
"version": "1.15.0",
"requires": {
@@ -16713,6 +16946,11 @@
"dev": true,
"requires": {}
},
"ieee754": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/ieee754/-/ieee754-1.2.1.tgz",
"integrity": "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA=="
},
"ignore": {
"version": "5.2.4",
"dev": true
@@ -18010,6 +18248,11 @@
"es-abstract": "^1.20.4"
}
},
"on-exit-leak-free": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/on-exit-leak-free/-/on-exit-leak-free-2.1.0.tgz",
"integrity": "sha512-VuCaZZAjReZ3vUwgOB8LxAosIurDiAW0s13rI1YwmaP++jvcxP77AWoQvenZebpCA2m8WC1/EosPYPMjnRAp/w=="
},
"once": {
"version": "1.4.0",
"requires": {
@@ -18142,6 +18385,51 @@
"version": "4.0.1",
"dev": true
},
"pino": {
"version": "8.12.1",
"resolved": "https://registry.npmjs.org/pino/-/pino-8.12.1.tgz",
"integrity": "sha512-n4F7nrEUDH0u5pjlUPkrvsN6oZOhScZWwpWPniwVkMnO6a2DhpnMNWDUlW0SawVICgrNhOf0yTNgw2lYQBsPYA==",
"requires": {
"atomic-sleep": "^1.0.0",
"fast-redact": "^3.1.1",
"on-exit-leak-free": "^2.1.0",
"pino-abstract-transport": "v1.0.0",
"pino-std-serializers": "^6.0.0",
"process-warning": "^2.0.0",
"quick-format-unescaped": "^4.0.3",
"real-require": "^0.2.0",
"safe-stable-stringify": "^2.3.1",
"sonic-boom": "^3.1.0",
"thread-stream": "^2.0.0"
}
},
"pino-abstract-transport": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/pino-abstract-transport/-/pino-abstract-transport-1.0.0.tgz",
"integrity": "sha512-c7vo5OpW4wIS42hUVcT5REsL8ZljsUfBjqV/e2sFxmFEFZiq1XLUp5EYLtuDH6PEHq9W1egWqRbnLUP5FuZmOA==",
"requires": {
"readable-stream": "^4.0.0",
"split2": "^4.0.0"
},
"dependencies": {
"readable-stream": {
"version": "4.3.0",
"resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-4.3.0.tgz",
"integrity": "sha512-MuEnA0lbSi7JS8XM+WNJlWZkHAAdm7gETHdFK//Q/mChGyj2akEFtdLZh32jSdkWGbRwCW9pn6g3LWDdDeZnBQ==",
"requires": {
"abort-controller": "^3.0.0",
"buffer": "^6.0.3",
"events": "^3.3.0",
"process": "^0.11.10"
}
}
}
},
"pino-std-serializers": {
"version": "6.2.1",
"resolved": "https://registry.npmjs.org/pino-std-serializers/-/pino-std-serializers-6.2.1.tgz",
"integrity": "sha512-wHuWB+CvSVb2XqXM0W/WOYUkVSPbiJb9S5fNB7TBhd8s892Xq910bRxwHtC4l71hgztObTjXL6ZheZXFjhDrDQ=="
},
"pirates": {
"version": "4.0.5"
},
@@ -18568,8 +18856,12 @@
}
},
"process": {
"version": "0.11.10",
"dev": true
"version": "0.11.10"
},
"process-warning": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/process-warning/-/process-warning-2.2.0.tgz",
"integrity": "sha512-/1WZ8+VQjR6avWOgHeEPd7SDQmFQ1B5mC1eRXsCm5TarlNmx/wCsa5GEaxGm05BORRtyG/Ex/3xq3TuRvq57qg=="
},
"prop-types": {
"version": "15.8.1",
@@ -18616,6 +18908,11 @@
"queue-microtask": {
"version": "1.2.3"
},
"quick-format-unescaped": {
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/quick-format-unescaped/-/quick-format-unescaped-4.0.4.tgz",
"integrity": "sha512-tYC1Q1hgyRuHgloV/YXs2w15unPVh8qfu/qCTfhTYamaw7fyhumKa2yGpdSo87vY32rIclj+4fWYQXUMs9EHvg=="
},
"quick-lru": {
"version": "5.1.1"
},
@@ -18786,6 +19083,11 @@
"picomatch": "^2.2.1"
}
},
"real-require": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/real-require/-/real-require-0.2.0.tgz",
"integrity": "sha512-57frrGM/OCTLqLOAh0mhVA9VBMHd+9U7Zb2THMGdBUoZVOtGbJzjxsYGDJ3A9AYYCP4hn6y1TVbaOfzWtm5GFg=="
},
"recoil": {
"version": "0.7.7",
"requires": {
@@ -19002,6 +19304,11 @@
"is-regex": "^1.1.4"
}
},
"safe-stable-stringify": {
"version": "2.4.3",
"resolved": "https://registry.npmjs.org/safe-stable-stringify/-/safe-stable-stringify-2.4.3.tgz",
"integrity": "sha512-e2bDA2WJT0wxseVd4lsDP4+3ONX6HpMXQa1ZhFQ7SU+GjvORCmShbCMltrtIDfkYhVHrOcPtj+KhmDBdPdZD1g=="
},
"safer-buffer": {
"version": "2.1.2"
},
@@ -19087,6 +19394,14 @@
"version": "2.0.0",
"dev": true
},
"sonic-boom": {
"version": "3.3.0",
"resolved": "https://registry.npmjs.org/sonic-boom/-/sonic-boom-3.3.0.tgz",
"integrity": "sha512-LYxp34KlZ1a2Jb8ZQgFCK3niIHzibdwtwNUWKg0qQRzsDoJ3Gfgkf8KdBTFU3SkejDEIlWwnSnpVdOZIhFMl/g==",
"requires": {
"atomic-sleep": "^1.0.0"
}
},
"source-map": {
"version": "0.6.1",
"dev": true
@@ -19128,6 +19443,11 @@
"space-separated-tokens": {
"version": "2.0.2"
},
"split2": {
"version": "4.2.0",
"resolved": "https://registry.npmjs.org/split2/-/split2-4.2.0.tgz",
"integrity": "sha512-UcjcJOWknrNkF6PLX83qcHM6KHgVKNkV62Y8a5uYDVv9ydGQVwAHMKqHdJje1VTWpljG0WYpCDhrCdAOYH4TWg=="
},
"stack-utils": {
"version": "2.0.6",
"resolved": "https://registry.npmjs.org/stack-utils/-/stack-utils-2.0.6.tgz",
@@ -19389,6 +19709,14 @@
"thenify": ">= 3.1.0 < 4"
}
},
"thread-stream": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/thread-stream/-/thread-stream-2.3.0.tgz",
"integrity": "sha512-kaDqm1DET9pp3NXwR8382WHbnpXnRkN9xGN9dQt3B2+dmXiW8X1SOwmFOxAErEQ47ObhZ96J6yhZNXuyCOL7KA==",
"requires": {
"real-require": "^0.2.0"
}
},
"to-fast-properties": {
"version": "2.0.0",
"dev": true
@@ -19929,4 +20257,4 @@
"version": "2.0.4"
}
}
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "chatgpt-clone",
"version": "0.4.1",
"name": "chat-frontend",
"version": "0.4.4",
"description": "",
"type": "module",
"scripts": {
@@ -53,6 +53,7 @@
"html2canvas": "^1.4.1",
"lodash": "^4.17.21",
"lucide-react": "^0.113.0",
"pino": "^8.12.1",
"rc-input-number": "^7.4.2",
"react": "^18.2.0",
"react-dom": "^18.2.0",

14 binary files changed (contents not shown); one is a 1.9 KiB image shown only as an "After" preview.

View File

@@ -126,7 +126,7 @@ export default function Conversation({ conversation, retainView }) {
/>
</div>
) : (
<div className="absolute inset-y-0 right-0 z-10 w-8 bg-gradient-to-l from-gray-900 group-hover:from-[#2A2B32]" />
<div className="absolute inset-y-0 right-0 z-10 w-8 bg-gradient-to-l from-gray-900 group-hover:from-[#2A2B32] rounded-r-md" />
)}
</a>
);

View File

@@ -40,7 +40,7 @@ function Settings(props) {
return (
<>
<div className="max-h-[350px] overflow-y-auto">
<div className="grid gap-6 sm:grid-cols-2">
<div className="col-span-1 flex flex-col items-center justify-start gap-6">
<div className="grid w-full items-center gap-2">
@@ -141,7 +141,7 @@ function Settings(props) {
)}
</div>
</div>
</>
</div>
);
}

View File

@@ -1,4 +1,6 @@
import React, { useEffect, useState } from 'react';
import Examples from './Google/Examples.jsx';
import MessagesSquared from '~/components/svg/MessagesSquared.jsx';
import { useSetRecoilState, useRecoilValue } from 'recoil';
import filenamify from 'filenamify';
import axios from 'axios';
@@ -7,6 +9,7 @@ import DialogTemplate from '../ui/DialogTemplate';
import { Dialog, DialogClose, DialogButton } from '../ui/Dialog.tsx';
import { Input } from '../ui/Input.tsx';
import { Label } from '../ui/Label.tsx';
import { Button } from '../ui/Button.tsx';
import Dropdown from '../ui/Dropdown';
import { cn } from '~/utils/';
import cleanupPreset from '~/utils/cleanupPreset';
@@ -18,11 +21,14 @@ import store from '~/store';
const EditPresetDialog = ({ open, onOpenChange, preset: _preset, title }) => {
// const [title, setTitle] = useState('My Preset');
const [preset, setPreset] = useState(_preset);
const [showExamples, setShowExamples] = useState(false);
const setPresets = useSetRecoilState(store.presets);
const availableEndpoints = useRecoilValue(store.availableEndpoints);
const endpointsConfig = useRecoilValue(store.endpointsConfig);
const triggerExamples = () => setShowExamples(prev => !prev);
const setOption = param => newValue => {
let update = {};
update[param] = newValue;
@@ -37,6 +43,69 @@ const EditPresetDialog = ({ open, onOpenChange, preset: _preset, title }) => {
);
};
const setExample = (i, type, newValue = null) => {
let update = {};
let current = preset?.examples.slice() || [];
let currentExample = { ...current[i] } || {};
currentExample[type] = { content: newValue };
current[i] = currentExample;
update.examples = current;
setPreset(prevState =>
cleanupPreset({
preset: {
...prevState,
...update
},
endpointsConfig
})
);
};
const addExample = () => {
let update = {};
let current = preset?.examples.slice() || [];
current.push({ input: { content: '' }, output: { content: '' } });
update.examples = current;
setPreset(prevState =>
cleanupPreset({
preset: {
...prevState,
...update
},
endpointsConfig
})
);
};
const removeExample = () => {
let update = {};
let current = preset?.examples.slice() || [];
if (current.length <= 1) {
update.examples = [{ input: { content: '' }, output: { content: '' } }];
setPreset(prevState =>
cleanupPreset({
preset: {
...prevState,
...update
},
endpointsConfig
})
);
return;
}
current.pop();
update.examples = current;
setPreset(prevState =>
cleanupPreset({
preset: {
...prevState,
...update
},
endpointsConfig
})
);
};
const defaultTextProps =
'rounded-md border border-gray-200 focus:border-slate-400 focus:bg-gray-50 bg-transparent text-sm shadow-[0_0_10px_rgba(0,0,0,0.05)] outline-none placeholder:text-gray-400 focus:outline-none focus:ring-gray-400 focus:ring-opacity-20 focus:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 dark:border-gray-500 dark:bg-gray-700 focus:dark:bg-gray-600 dark:text-gray-50 dark:shadow-[0_0_15px_rgba(0,0,0,0.10)] dark:focus:border-gray-400 dark:focus:outline-none dark:focus:ring-0 dark:focus:ring-gray-400 dark:focus:ring-offset-0';
@@ -111,14 +180,35 @@ const EditPresetDialog = ({ open, onOpenChange, preset: _preset, title }) => {
)}
containerClassName="flex w-full resize-none"
/>
{preset?.endpoint === 'google' && (
<Button
type="button"
className="ml-1 flex h-auto w-full bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-0 focus:ring-offset-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white dark:focus:outline-none dark:focus:ring-offset-0"
onClick={triggerExamples}
>
<MessagesSquared className="mr-1 w-[14px]" />
{(showExamples ? 'Hide' : 'Show') + ' Examples'}
</Button>
)}
</div>
</div>
<div className="my-4 w-full border-t border-gray-300 dark:border-gray-500" />
<div className="w-full p-0">
<Settings
preset={preset}
setOption={setOption}
/>
{((preset?.endpoint === 'google' && !showExamples) || preset?.endpoint !== 'google') && (
<Settings
preset={_preset}
setOption={setOption}
/>
)}
{preset?.endpoint === 'google' && showExamples && (
<Examples
examples={preset.examples}
setExample={setExample}
addExample={addExample}
removeExample={removeExample}
edit={true}
/>
)}
</div>
</div>
}

View File

@@ -12,12 +12,16 @@ import store from '~/store';
// A preset dialog to show readonly preset values.
const EndpointOptionsDialog = ({ open, onOpenChange, preset: _preset, title }) => {
// const [title, setTitle] = useState('My Preset');
const [preset, setPreset] = useState(_preset);
const [endpointName, setEndpointName] = useState(preset?.endpoint);
const [saveAsDialogShow, setSaveAsDialogShow] = useState(false);
const endpointsConfig = useRecoilValue(store.endpointsConfig);
if (endpointName === 'google') {
setEndpointName('PaLM');
}
const setOption = param => newValue => {
let update = {};
update[param] = newValue;
@@ -50,7 +54,7 @@ const EndpointOptionsDialog = ({ open, onOpenChange, preset: _preset, title }) =
onOpenChange={onOpenChange}
>
<DialogTemplate
title={`${title || 'View Options'} - ${preset?.endpoint}`}
title={`${title || 'View Options'} - ${endpointName}`}
className="max-w-full sm:max-w-4xl"
main={
<div className="flex w-full flex-col items-center gap-2">

View File

@@ -4,7 +4,13 @@ import CrossIcon from '../svg/CrossIcon';
// import SaveIcon from '../svg/SaveIcon';
import { Save } from 'lucide-react';
function EndpointOptionsPopover({ content, visible, saveAsPreset, switchToSimpleMode }) {
function EndpointOptionsPopover({
content,
visible,
saveAsPreset,
switchToSimpleMode,
additionalButton = null
}) {
const cardStyle =
'shadow-md rounded-md min-w-[75px] font-normal bg-white border-black/10 border dark:bg-gray-700 text-black dark:text-white';
@@ -12,29 +18,39 @@ function EndpointOptionsPopover({ content, visible, saveAsPreset, switchToSimple
<>
<div
className={
' endpointOptionsPopover-container absolute bottom-[-10px] flex w-full flex-col items-center justify-center md:px-4' +
' endpointOptionsPopover-container absolute bottom-[-10px] flex w-full flex-col items-center md:px-4' +
(visible ? ' show' : '')
}
>
<div
className={
cardStyle +
' border-s-0 border-d-0 flex w-full flex-col overflow-hidden rounded-none border-t bg-slate-200 px-0 pb-[10px] dark:border-white/10 md:rounded-md md:border lg:w-[736px]'
' border-d-0 flex w-full flex-col overflow-hidden rounded-none border-s-0 border-t bg-slate-200 px-0 pb-[10px] dark:border-white/10 md:rounded-md md:border lg:w-[736px]'
}
>
<div className="flex w-full items-center justify-between bg-slate-100 px-2 py-2 dark:bg-gray-800/60">
<div className="flex w-full items-center bg-slate-100 px-2 py-2 dark:bg-gray-800/60">
{/* <span className="text-xs font-medium font-normal">Advanced settings for OpenAI endpoint</span> */}
<Button
type="button"
className="h-auto bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white"
className="h-auto justify-start bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white dark:focus:outline-none dark:focus:ring-offset-0"
onClick={saveAsPreset}
>
<Save className="mr-1 w-[14px]" />
Save as preset
</Button>
{additionalButton && (
<Button
type="button"
className="ml-1 h-auto justify-start bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-0 focus:ring-offset-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white dark:focus:outline-none dark:focus:ring-offset-0"
onClick={additionalButton.handler}
>
{additionalButton.icon}
{additionalButton.label}
</Button>
)}
<Button
type="button"
className="h-auto bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white"
className="ml-auto h-auto bg-transparent px-2 py-1 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-offset-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-700 dark:hover:text-white"
onClick={switchToSimpleMode}
>
<CrossIcon className="mr-1" />

View File

@@ -0,0 +1,98 @@
import React from 'react';
import TextareaAutosize from 'react-textarea-autosize';
import { Button } from '~/components/ui/Button.tsx';
import { Label } from '~/components/ui/Label.tsx';
import { Plus, Minus } from 'lucide-react';
import { cn } from '~/utils/';
const defaultTextProps =
'rounded-md border border-gray-200 focus:border-slate-400 focus:bg-gray-50 bg-transparent text-sm shadow-[0_0_10px_rgba(0,0,0,0.05)] outline-none placeholder:text-gray-400 focus:outline-none focus:ring-gray-400 focus:ring-opacity-20 focus:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 dark:border-gray-500 dark:bg-gray-700 focus:dark:bg-gray-600 dark:text-gray-50 dark:shadow-[0_0_15px_rgba(0,0,0,0.10)] dark:focus:border-gray-400 dark:focus:outline-none dark:focus:ring-0 dark:focus:ring-gray-400 dark:focus:ring-offset-0';
function Examples({ readonly, examples, setExample, addExample, removeExample, edit = false }) {
const maxHeight = edit ? 'max-h-[233px]' : 'max-h-[350px]';
return (
<>
<div className={`${maxHeight} overflow-y-auto`}>
<div
id="examples-grid"
className="grid gap-6 sm:grid-cols-2"
>
{examples.map((example, idx) => (
<React.Fragment key={idx}>
{/* Input */}
<div
className={`col-span-${
examples.length === 1 ? '1' : 'full'
} flex flex-col items-center justify-start gap-6 sm:col-span-1`}
>
<div className="grid w-full items-center gap-2">
<Label
htmlFor={`input-${idx}`}
className="text-left text-sm font-medium"
>
Input <small className="opacity-40">(default: blank)</small>
</Label>
<TextareaAutosize
id={`input-${idx}`}
disabled={readonly}
value={example?.input?.content || ''}
onChange={e => setExample(idx, 'input', e.target.value || null)}
placeholder="Set example input. Example is ignored if empty."
className={cn(
defaultTextProps,
'flex max-h-[300px] min-h-[75px] w-full resize-none px-3 py-2 '
)}
/>
</div>
</div>
{/* Output */}
<div
className={`col-span-${
examples.length === 1 ? '1' : 'full'
} flex flex-col items-center justify-start gap-6 sm:col-span-1`}
>
<div className="grid w-full items-center gap-2">
<Label
htmlFor={`output-${idx}`}
className="text-left text-sm font-medium"
>
Output <small className="opacity-40">(default: blank)</small>
</Label>
<TextareaAutosize
id={`output-${idx}`}
disabled={readonly}
value={example?.output?.content || ''}
onChange={e => setExample(idx, 'output', e.target.value || null)}
placeholder={`Set example output. Example is ignored if empty.`}
className={cn(
defaultTextProps,
'flex max-h-[300px] min-h-[75px] w-full resize-none px-3 py-2 '
)}
/>
</div>
</div>
</React.Fragment>
))}
</div>
</div>
<div className="flex justify-center">
<Button
type="button"
className="mr-2 mt-1 h-auto items-center justify-center bg-transparent px-3 py-2 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-0 focus:ring-offset-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-600 dark:hover:text-white dark:focus:outline-none dark:focus:ring-offset-0"
onClick={removeExample}
>
<Minus className="w-[16px]" />
</Button>
<Button
type="button"
className="mt-1 h-auto items-center justify-center bg-transparent px-3 py-2 text-xs font-medium font-normal text-black hover:bg-slate-200 hover:text-black focus:ring-0 focus:ring-offset-0 dark:bg-transparent dark:text-white dark:hover:bg-gray-600 dark:hover:text-white dark:focus:outline-none dark:focus:ring-offset-0"
onClick={addExample}
>
<Plus className="w-[16px]" />
</Button>
</div>
</>
);
}
export default Examples;
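
A minimal, hypothetical parent for this component, shown only to make the expected prop shapes concrete; the dialogs and GoogleOptions later in this diff wire it up the same way, and the state records mirror the { input: { content }, output: { content } } shape used throughout these changes.

import React, { useState } from 'react';
import Examples from './Examples.jsx'; // path assumed

function ExampleEditor() {
  const [examples, setExamples] = useState([{ input: { content: '' }, output: { content: '' } }]);

  // Update one side (input/output) of the i-th example.
  const setExample = (i, type, value = null) =>
    setExamples(prev => prev.map((ex, idx) => (idx === i ? { ...ex, [type]: { content: value } } : ex)));
  // Append a blank example pair.
  const addExample = () =>
    setExamples(prev => [...prev, { input: { content: '' }, output: { content: '' } }]);
  // Drop the last pair, always keeping at least one.
  const removeExample = () =>
    setExamples(prev => (prev.length > 1 ? prev.slice(0, -1) : prev));

  return (
    <Examples
      examples={examples}
      setExample={setExample}
      addExample={addExample}
      removeExample={removeExample}
    />
  );
}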

View File

@@ -0,0 +1,32 @@
import React from 'react';
import { HoverCardPortal, HoverCardContent } from '~/components/ui/HoverCard.tsx';
const types = {
temp: 'Higher values = more random, while lower values = more focused and deterministic. We recommend altering this or Top P but not both.',
topp: 'Top-p changes how the model selects tokens for output. Tokens are selected from most K (see topK parameter) probable to least until the sum of their probabilities equals the top-p value.',
topk: "Top-k changes how the model selects tokens for output. A top-k of 1 means the selected token is the most probable among all tokens in the model's vocabulary (also called greedy decoding), while a top-k of 3 means that the next token is selected from among the 3 most probable tokens (using temperature).",
maxoutputtokens: " Maximum number of tokens that can be generated in the response. Specify a lower value for shorter responses and a higher value for longer responses."
};
function OptionHover({ type, side }) {
// const options = {};
// if (type === 'pres') {
// options.sideOffset = 45;
// }
return (
<HoverCardPortal>
<HoverCardContent
side={side}
className="w-80 "
// {...options}
>
<div className="space-y-2">
<p className="text-sm text-gray-600 dark:text-gray-300">{types[type]}</p>
</div>
</HoverCardContent>
</HoverCardPortal>
);
}
export default OptionHover;

View File

@@ -0,0 +1,271 @@
import { useRecoilValue } from 'recoil';
import TextareaAutosize from 'react-textarea-autosize';
import SelectDropDown from '../../ui/SelectDropDown';
import { Input } from '~/components/ui/Input.tsx';
import { Label } from '~/components/ui/Label.tsx';
import { Slider } from '~/components/ui/Slider.tsx';
import { InputNumber } from '~/components/ui/InputNumber.tsx';
import OptionHover from './OptionHover';
import { HoverCard, HoverCardTrigger } from '~/components/ui/HoverCard.tsx';
import { cn } from '~/utils/';
const defaultTextProps =
'rounded-md border border-gray-200 focus:border-slate-400 focus:bg-gray-50 bg-transparent text-sm shadow-[0_0_10px_rgba(0,0,0,0.05)] outline-none placeholder:text-gray-400 focus:outline-none focus:ring-gray-400 focus:ring-opacity-20 focus:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 dark:border-gray-500 dark:bg-gray-700 focus:dark:bg-gray-600 dark:text-gray-50 dark:shadow-[0_0_15px_rgba(0,0,0,0.10)] dark:focus:border-gray-400 dark:focus:outline-none dark:focus:ring-0 dark:focus:ring-gray-400 dark:focus:ring-offset-0';
const optionText =
'p-0 shadow-none text-right pr-1 h-8 border-transparent focus:ring-[#10a37f] focus:ring-offset-0 focus:ring-opacity-100 hover:bg-gray-800/10 dark:hover:bg-white/10 focus:bg-gray-800/10 dark:focus:bg-white/10 transition-colors';
import store from '~/store';
function Settings(props) {
const { readonly, model, modelLabel, promptPrefix, temperature, topP, topK, maxOutputTokens, setOption, edit = false } = props;
const maxHeight = edit ? 'max-h-[233px]' : 'max-h-[350px]';
const endpointsConfig = useRecoilValue(store.endpointsConfig);
const setModel = setOption('model');
const setModelLabel = setOption('modelLabel');
const setPromptPrefix = setOption('promptPrefix');
const setTemperature = setOption('temperature');
const setTopP = setOption('topP');
const setTopK = setOption('topK');
const setMaxOutputTokens = setOption('maxOutputTokens');
const models = endpointsConfig?.['google']?.['availableModels'] || [];
return (
<div className={`${maxHeight} overflow-y-auto`}>
<div className="grid gap-6 sm:grid-cols-2">
<div className="col-span-1 flex flex-col items-center justify-start gap-6">
<div className="grid w-full items-center gap-2">
<SelectDropDown
value={model}
setValue={setModel}
availableValues={models}
disabled={readonly}
className={cn(
defaultTextProps,
'flex w-full resize-none focus:outline-none focus:ring-0 focus:ring-opacity-0 focus:ring-offset-0'
)}
containerClassName="flex w-full resize-none"
/>
</div>
<div className="grid w-full items-center gap-2">
<Label
htmlFor="modelLabel"
className="text-left text-sm font-medium"
>
Custom Name <small className="opacity-40">(default: blank)</small>
</Label>
<Input
id="modelLabel"
disabled={readonly}
value={modelLabel || ''}
onChange={e => setModelLabel(e.target.value || null)}
placeholder="Set a custom name for PaLM2"
className={cn(
defaultTextProps,
'flex h-10 max-h-10 w-full resize-none px-3 py-2 focus:outline-none focus:ring-0 focus:ring-opacity-0 focus:ring-offset-0'
)}
/>
</div>
<div className="grid w-full items-center gap-2">
<Label
htmlFor="promptPrefix"
className="text-left text-sm font-medium"
>
Prompt Prefix <small className="opacity-40">(default: blank)</small>
</Label>
<TextareaAutosize
id="promptPrefix"
disabled={readonly}
value={promptPrefix || ''}
onChange={e => setPromptPrefix(e.target.value || null)}
placeholder="Set custom instructions or context. Ignored if empty."
className={cn(
defaultTextProps,
'flex max-h-[300px] min-h-[100px] w-full resize-none px-3 py-2 '
)}
/>
</div>
</div>
<div className="col-span-1 flex flex-col items-center justify-start gap-6">
<HoverCard openDelay={300}>
<HoverCardTrigger className="grid w-full items-center gap-2">
<div className="flex justify-between">
<Label
htmlFor="temp-int"
className="text-left text-sm font-medium"
>
Temperature <small className="opacity-40">(default: 0.2)</small>
</Label>
<InputNumber
id="temp-int"
disabled={readonly}
value={temperature}
onChange={value => setTemperature(value)}
max={1}
min={0}
step={0.01}
controls={false}
className={cn(
defaultTextProps,
cn(
optionText,
'reset-rc-number-input reset-rc-number-input-text-right h-auto w-12 border-0 group-hover/temp:border-gray-200'
)
)}
/>
</div>
<Slider
disabled={readonly}
value={[temperature]}
onValueChange={value => setTemperature(value[0])}
doubleClickHandler={() => setTemperature(1)}
max={1}
min={0}
step={0.01}
className="flex h-4 w-full"
/>
</HoverCardTrigger>
<OptionHover
type="temp"
side="left"
/>
</HoverCard>
<HoverCard openDelay={300}>
<HoverCardTrigger className="grid w-full items-center gap-2">
<div className="flex justify-between">
<Label
htmlFor="top-p-int"
className="text-left text-sm font-medium"
>
Top P <small className="opacity-40">(default: 0.95)</small>
</Label>
<InputNumber
id="top-p-int"
disabled={readonly}
value={topP}
onChange={value => setTopP(value)}
max={1}
min={0}
step={0.01}
controls={false}
className={cn(
defaultTextProps,
cn(
optionText,
'reset-rc-number-input reset-rc-number-input-text-right h-auto w-12 border-0 group-hover/temp:border-gray-200'
)
)}
/>
</div>
<Slider
disabled={readonly}
value={[topP]}
onValueChange={value => setTopP(value[0])}
doubleClickHandler={() => setTopP(1)}
max={1}
min={0}
step={0.01}
className="flex h-4 w-full"
/>
</HoverCardTrigger>
<OptionHover
type="topp"
side="left"
/>
</HoverCard>
<HoverCard openDelay={300}>
<HoverCardTrigger className="grid w-full items-center gap-2">
<div className="flex justify-between">
<Label
htmlFor="top-k-int"
className="text-left text-sm font-medium"
>
Top K <small className="opacity-40">(default: 40)</small>
</Label>
<InputNumber
id="top-k-int"
disabled={readonly}
value={topK}
onChange={value => setTopK(value)}
max={40}
min={1}
step={0.01}
controls={false}
className={cn(
defaultTextProps,
cn(
optionText,
'reset-rc-number-input reset-rc-number-input-text-right h-auto w-12 border-0 group-hover/temp:border-gray-200'
)
)}
/>
</div>
<Slider
disabled={readonly}
value={[topK]}
onValueChange={value => setTopK(value[0])}
doubleClickHandler={() => setTopK(0)}
max={40}
min={1}
step={0.01}
className="flex h-4 w-full"
/>
</HoverCardTrigger>
<OptionHover
type="topk"
side="left"
/>
</HoverCard>
<HoverCard openDelay={300}>
<HoverCardTrigger className="grid w-full items-center gap-2">
<div className="flex justify-between">
<Label
htmlFor="max-tokens-int"
className="text-left text-sm font-medium"
>
Max Output Tokens <small className="opacity-40">(default: 1024)</small>
</Label>
<InputNumber
id="max-tokens-int"
disabled={readonly}
value={maxOutputTokens}
onChange={value => setMaxOutputTokens(value)}
max={1024}
min={1}
step={1}
controls={false}
className={cn(
defaultTextProps,
cn(
optionText,
'reset-rc-number-input reset-rc-number-input-text-right h-auto w-12 border-0 group-hover/temp:border-gray-200'
)
)}
/>
</div>
<Slider
disabled={readonly}
value={[maxOutputTokens]}
onValueChange={value => setMaxOutputTokens(value[0])}
doubleClickHandler={() => setMaxOutputTokens(0)}
max={1024}
min={1}
step={1}
className="flex h-4 w-full"
/>
</HoverCardTrigger>
<OptionHover
type="maxoutputtokens"
side="left"
/>
</HoverCard>
</div>
</div>
</div>
);
}
export default Settings;

View File

@@ -32,7 +32,7 @@ function Settings(props) {
const models = endpointsConfig?.['openAI']?.['availableModels'] || [];
return (
<>
<div className="max-h-[350px] overflow-y-auto">
<div className="grid gap-6 sm:grid-cols-2">
<div className="col-span-1 flex flex-col items-center justify-start gap-6">
<div className="grid w-full items-center gap-2">
@@ -264,7 +264,7 @@ function Settings(props) {
</HoverCard>
</div>
</div>
</>
</div>
);
}

View File

@@ -2,13 +2,15 @@ import React from 'react';
import OpenAISettings from './OpenAI/Settings.jsx';
import BingAISettings from './BingAI/Settings.jsx';
import GoogleSettings from './Google/Settings.jsx';
// A preset dialog to show readonly preset values.
const Settings = ({ preset, ...props }) => {
const renderSettings = () => {
const { endpoint } = preset || {};
// console.log('preset', preset);
if (endpoint === 'openAI')
if (endpoint === 'openAI') {
return (
<OpenAISettings
model={preset?.model}
@@ -21,7 +23,7 @@ const Settings = ({ preset, ...props }) => {
{...props}
/>
);
else if (endpoint === 'bingAI')
} else if (endpoint === 'bingAI') {
return (
<BingAISettings
toneStyle={preset?.toneStyle}
@@ -31,7 +33,24 @@ const Settings = ({ preset, ...props }) => {
{...props}
/>
);
else return <div className="text-black dark:text-white">Not implemented</div>;
} else if (endpoint === 'google') {
return (
<GoogleSettings
model={preset?.model}
modelLabel={preset?.modelLabel}
promptPrefix={preset?.promptPrefix}
examples={preset?.examples}
temperature={preset?.temperature}
topP={preset?.topP}
topK={preset?.topK}
maxOutputTokens={preset?.maxOutputTokens}
edit={true}
{...props}
/>
);
} else {
return <div className="text-black dark:text-white">Not implemented</div>;
}
};
return renderSettings();

View File

@@ -0,0 +1,170 @@
import { useState } from 'react';
import { Settings2 } from 'lucide-react';
import { useRecoilState, useRecoilValue } from 'recoil';
import MessagesSquared from '~/components/svg/MessagesSquared.jsx';
import SelectDropDown from '../../ui/SelectDropDown';
import EndpointOptionsPopover from '../../Endpoints/EndpointOptionsPopover';
import SaveAsPresetDialog from '../../Endpoints/SaveAsPresetDialog';
import { Button } from '../../ui/Button.tsx';
import Settings from '../../Endpoints/Google/Settings.jsx';
import Examples from '../../Endpoints/Google/Examples.jsx';
import { cn } from '~/utils/';
import store from '~/store';
function GoogleOptions() {
const [advancedMode, setAdvancedMode] = useState(false);
const [showExamples, setShowExamples] = useState(false);
const [saveAsDialogShow, setSaveAsDialogShow] = useState(false);
const [conversation, setConversation] = useRecoilState(store.conversation) || {};
const { endpoint, conversationId } = conversation;
const { model, modelLabel, promptPrefix, examples, temperature, topP, topK, maxOutputTokens } =
conversation;
const endpointsConfig = useRecoilValue(store.endpointsConfig);
if (endpoint !== 'google') return null;
if (conversationId !== 'new') return null;
const models = endpointsConfig?.['google']?.['availableModels'] || [];
const triggerAdvancedMode = () => setAdvancedMode(prev => !prev);
const triggerExamples = () => setShowExamples(prev => !prev);
const switchToSimpleMode = () => {
setAdvancedMode(false);
};
const saveAsPreset = () => {
setSaveAsDialogShow(true);
};
const setOption = param => newValue => {
let update = {};
update[param] = newValue;
setConversation(prevState => ({
...prevState,
...update
}));
};
const setExample = (i, type, newValue = null) => {
let update = {};
let current = conversation?.examples.slice() || [];
let currentExample = { ...current[i] } || {};
currentExample[type] = { content: newValue };
current[i] = currentExample;
update.examples = current;
setConversation(prevState => ({
...prevState,
...update
}));
};
const addExample = () => {
let update = {};
let current = conversation?.examples.slice() || [];
current.push({ input: { content: '' }, output: { content: '' } });
update.examples = current;
setConversation(prevState => ({
...prevState,
...update
}));
};
const removeExample = () => {
let update = {};
let current = conversation?.examples.slice() || [];
if (current.length <= 1) {
update.examples = [{ input: { content: '' }, output: { content: '' } }];
setConversation(prevState => ({
...prevState,
...update
}));
return;
}
current.pop();
update.examples = current;
setConversation(prevState => ({
...prevState,
...update
}));
};
const cardStyle =
'transition-colors shadow-md rounded-md min-w-[75px] font-normal bg-white border-black/10 hover:border-black/10 focus:border-black/10 dark:border-black/10 dark:hover:border-black/10 dark:focus:border-black/10 border dark:bg-gray-700 text-black dark:text-white';
return (
<>
<div
className={
'openAIOptions-simple-container flex w-full flex-wrap items-center justify-center gap-2' +
(!advancedMode ? ' show' : '')
}
>
<SelectDropDown
value={model}
setValue={setOption('model')}
availableValues={models}
showAbove={true}
showLabel={false}
className={cn(
cardStyle,
'min-w-48 z-50 flex h-[40px] w-48 flex-none items-center justify-center px-4 ring-0 hover:cursor-pointer hover:bg-slate-50 focus:ring-0 focus:ring-offset-0 data-[state=open]:bg-slate-50 dark:bg-gray-700 dark:hover:bg-gray-600 dark:data-[state=open]:bg-gray-600'
)}
/>
<Button
type="button"
className={cn(
cardStyle,
'min-w-4 z-50 flex h-[40px] flex-none items-center justify-center px-4 hover:bg-slate-50 focus:ring-0 focus:ring-offset-0 dark:hover:bg-gray-600'
)}
onClick={triggerAdvancedMode}
>
<Settings2 className="w-4 text-gray-600 dark:text-white" />
</Button>
</div>
<EndpointOptionsPopover
content={
<div className="px-4 py-4">
{showExamples ? (
<Examples
examples={examples}
setExample={setExample}
addExample={addExample}
removeExample={removeExample}
/>
) : (
<Settings
model={model}
modelLabel={modelLabel}
promptPrefix={promptPrefix}
temperature={temperature}
topP={topP}
topK={topK}
maxOutputTokens={maxOutputTokens}
setOption={setOption}
/>
)}
</div>
}
visible={advancedMode}
saveAsPreset={saveAsPreset}
switchToSimpleMode={switchToSimpleMode}
additionalButton={{
label: (showExamples ? 'Hide' : 'Show') + ' Examples',
handler: triggerExamples,
icon: <MessagesSquared className="mr-1 w-[14px]" />
}}
/>
<SaveAsPresetDialog
open={saveAsDialogShow}
onOpenChange={setSaveAsDialogShow}
preset={conversation}
/>
</>
);
}
export default GoogleOptions;
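For reference, the handlers above keep PaLM few-shot examples in conversation state as `{ input: { content }, output: { content } }` pairs. A minimal sketch of that shape and of a `setExample` update (the strings are illustrative placeholders, not values from this diff):

```js
// Sketch of the examples shape managed by GoogleOptions above; strings are placeholders.
const conversation = {
  endpoint: 'google',
  model: 'chat-bison',
  examples: [{ input: { content: 'Ping?' }, output: { content: 'Pong.' } }]
};

// setExample(0, 'output', 'Pong!') replaces examples[0].output with { content: 'Pong!' }:
const updated = {
  ...conversation,
  examples: [{ input: { content: 'Ping?' }, output: { content: 'Pong!' } }]
};
console.log(updated.examples[0].output.content); // 'Pong!'
```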

View File

@@ -7,6 +7,14 @@ import SetTokenDialog from '../SetTokenDialog';
import store from '../../../store';
const alternateName = {
openAI: 'OpenAI',
azureOpenAI: 'Azure OpenAI',
bingAI: 'Bing',
chatGPTBrowser: 'ChatGPT',
google: 'PaLM',
}
export default function ModelItem({ endpoint, value, onSelect }) {
const [setTokenDialogOpen, setSetTokenDialogOpen] = useState(false);
const endpointsConfig = useRecoilValue(store.endpointsConfig);
@@ -28,7 +36,7 @@ export default function ModelItem({ endpoint, value, onSelect }) {
className="group dark:font-semibold dark:text-gray-100 dark:hover:bg-gray-800"
>
{icon}
{endpoint}
{alternateName[endpoint] || endpoint}
{!!['azureOpenAI', 'openAI'].find(e => e === endpoint) && <sup>$</sup>}
<div className="flex w-4 flex-1" />
{isuserProvide ? (

View File

@@ -1,13 +1,10 @@
import React from 'react';
import { useState } from 'react';
import { FileUp } from 'lucide-react';
import cleanupPreset from '~/utils/cleanupPreset.js';
import { useRecoilValue } from 'recoil';
import { cn } from '~/utils/';
import store from '~/store';
const FileUpload = ({ onFileSelected }) => {
// const setPresets = useSetRecoilState(store.presets);
const endpointsConfig = useRecoilValue(store.endpointsConfig);
const FileUpload = ({ onFileSelected, successText = null, invalidText = null, validator = null, text = null, id = '1' }) => {
const [statusColor, setStatusColor] = useState('text-gray-600');
const [status, setStatus] = useState(null);
const handleFileChange = event => {
const file = event.target.files[0];
@@ -16,20 +13,34 @@ const FileUpload = ({ onFileSelected }) => {
const reader = new FileReader();
reader.onload = e => {
const jsonData = JSON.parse(e.target.result);
onFileSelected({ ...cleanupPreset({ preset: jsonData, endpointsConfig }), presetId: null });
if (validator && !validator(jsonData)) {
setStatus('invalid');
setStatusColor('text-red-600');
return;
}
if (validator) {
setStatus('success');
setStatusColor('text-green-500 dark:text-green-500');
}
onFileSelected(jsonData);
};
reader.readAsText(file);
};
return (
<label
htmlFor="file-upload"
className=" mr-1 flex h-auto cursor-pointer items-center rounded bg-transparent px-2 py-1 text-xs font-medium font-normal text-gray-600 transition-colors hover:bg-slate-200 hover:text-green-700 dark:bg-transparent dark:text-gray-300 dark:hover:bg-gray-800 dark:hover:text-green-500"
htmlFor={`file-upload-${id}`}
className={cn(
'mr-1 flex h-auto cursor-pointer items-center rounded bg-transparent px-2 py-1 text-xs font-medium font-normal transition-colors hover:bg-slate-200 hover:text-green-700 dark:bg-transparent dark:text-gray-300 dark:hover:bg-gray-800 dark:hover:text-green-500',
statusColor
)}
>
<FileUp className="mr-1 flex w-[22px] items-center stroke-1" />
<span className="flex text-xs ">Import</span>
<span className="flex text-xs ">{!status ? text || 'Import' : (status === 'success' ? successText : invalidText)}</span>
<input
id="file-upload"
id={`file-upload-${id}`}
value=""
type="file"
className="hidden "

View File

@@ -21,6 +21,10 @@ export default function PresetItem({ preset = {}, value, onSelect, onChangePrese
const { chatGptLabel, model } = preset;
if (model) _title += `: ${model}`;
if (chatGptLabel) _title += ` as ${chatGptLabel}`;
} else if (endpoint === 'google') {
const { modelLabel, model } = preset;
if (model) _title += `: ${model}`;
if (modelLabel) _title += ` as ${modelLabel}`;
} else if (endpoint === 'bingAI') {
const { jailbreak, toneStyle } = preset;
if (toneStyle) _title += `: ${toneStyle}`;

View File

@@ -1,4 +1,5 @@
import React, { useState, useEffect } from 'react';
import cleanupPreset from '~/utils/cleanupPreset.js';
import { useRecoilValue, useRecoilState } from 'recoil';
import EditPresetDialog from '../../Endpoints/EditPresetDialog';
import EndpointItems from './EndpointItems';
@@ -23,10 +24,12 @@ import store from '~/store';
export default function NewConversationMenu() {
const [menuOpen, setMenuOpen] = useState(false);
const [showPresets, setShowPresets] = useState(true);
const [presetModelVisible, setPresetModelVisible] = useState(false);
const [preset, setPreset] = useState(false);
const availableEndpoints = useRecoilValue(store.availableEndpoints);
const endpointsConfig = useRecoilValue(store.endpointsConfig);
const [presets, setPresets] = useRecoilState(store.presets);
const conversation = useRecoilValue(store.conversation) || {};
@@ -50,6 +53,11 @@ export default function NewConversationMenu() {
);
};
const onFileSelected = jsonData => {
const jsonPreset = { ...cleanupPreset({ preset: jsonData, endpointsConfig }), presetId: null };
importPreset(jsonPreset);
};
// update the default model when availableModels changes
// typically, availableModels changes => modelsFilter or customGPTModels changes
useEffect(() => {
@@ -64,7 +72,10 @@ export default function NewConversationMenu() {
if (endpoint) {
const lastSelectedModel = JSON.parse(localStorage.getItem('lastSelectedModel')) || {};
localStorage.setItem('lastConversationSetup', JSON.stringify(conversation));
localStorage.setItem('lastSelectedModel', JSON.stringify({ ...lastSelectedModel, [endpoint] : conversation.model }));
localStorage.setItem(
'lastSelectedModel',
JSON.stringify({ ...lastSelectedModel, [endpoint]: conversation.model })
);
}
}, [conversation]);
@@ -149,9 +160,14 @@ export default function NewConversationMenu() {
<div className="mt-6 w-full" />
<DropdownMenuLabel className="flex items-center dark:text-gray-300">
<span>Select a Preset</span>
<span
className="cursor-pointer"
onClick={() => setShowPresets(prev => !prev)}
>
{showPresets ? 'Hide ' : 'Show '} Presets
</span>
<div className="flex-1" />
<FileUpload onFileSelected={importPreset} />
<FileUpload onFileSelected={onFileSelected} />
<Dialog>
<DialogTrigger asChild>
<label
@@ -181,18 +197,19 @@ export default function NewConversationMenu() {
<DropdownMenuSeparator />
<DropdownMenuRadioGroup
onValueChange={onSelectPreset}
className="overflow-y-auto"
className="max-h-[150px] overflow-y-auto"
>
{presets.length ? (
<PresetItems
presets={presets}
onSelect={onSelectPreset}
onChangePreset={onChangePreset}
onDeletePreset={onDeletePreset}
/>
) : (
<DropdownMenuLabel className="dark:text-gray-300">No preset yet.</DropdownMenuLabel>
)}
{showPresets &&
(presets.length ? (
<PresetItems
presets={presets}
onSelect={onSelectPreset}
onChangePreset={onChangePreset}
onDeletePreset={onDeletePreset}
/>
) : (
<DropdownMenuLabel className="dark:text-gray-300">No preset yet.</DropdownMenuLabel>
))}
</DropdownMenuRadioGroup>
</DropdownMenuContent>
</DropdownMenu>
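For context, the effect above merges the current selection into a per-endpoint map before writing it back to localStorage. A minimal sketch of the stored value (the model names are illustrative):

```js
// Hypothetical contents of 'lastSelectedModel' after choosing models on two endpoints;
// each selection spreads the previous map and overwrites its own endpoint key.
localStorage.setItem(
  'lastSelectedModel',
  JSON.stringify({ openAI: 'gpt-3.5-turbo', google: 'chat-bison' })
);
console.log(JSON.parse(localStorage.getItem('lastSelectedModel')).google); // 'chat-bison'
```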

View File

@@ -1,12 +1,10 @@
import React, { useEffect, useState } from 'react';
import { useRecoilValue } from 'recoil';
import DialogTemplate from '../../ui/DialogTemplate';
import { Dialog } from '../../ui/Dialog.tsx';
import { Input } from '../../ui/Input.tsx';
import { Label } from '../../ui/Label.tsx';
import { cn } from '~/utils/';
import cleanupPreset from '~/utils/cleanupPreset';
import { useCreatePresetMutation } from '~/data-provider';
import FileUpload from '../NewConversationMenu/FileUpload';
import store from '~/store';
const SetTokenDialog = ({ open, onOpenChange, endpoint }) => {
@@ -54,6 +52,30 @@ const SetTokenDialog = ({ open, onOpenChange, endpoint }) => {
</a>
. Copy access token.
</small>
),
google: (
<small className="break-all text-gray-600">
You need to{' '}
<a
target="_blank"
href="https://console.cloud.google.com/vertex-ai"
rel="noreferrer"
className="text-blue-600 underline"
>
Enable Vertex AI
</a>{' '}
API on Google Cloud, then{' '}
<a
target="_blank"
href="https://console.cloud.google.com/projectselector/iam-admin/serviceaccounts/create?walkthrough_id=iam--create-service-account#step_index=1"
rel="noreferrer"
className="text-blue-600 underline"
>
Create a Service Account
</a>
. Make sure to click 'Create and Continue' and grant at least the 'Vertex AI User' role. Finally, create a JSON key to import here.
</small>
)
};
@@ -73,19 +95,61 @@ const SetTokenDialog = ({ open, onOpenChange, endpoint }) => {
Token Name
<br />
</Label>
<Input
id="chatGptLabel"
value={token || ''}
onChange={e => setToken(e.target.value || '')}
placeholder="Set the token."
className={cn(
defaultTextProps,
'flex h-10 max-h-10 w-full resize-none px-3 py-2 focus:outline-none focus:ring-0 focus:ring-opacity-0 focus:ring-offset-0'
)}
/>
<small className="text-red-600">
Your token will be sent to the server, but not saved.
</small>
{endpoint === 'google' ? (
<FileUpload
id="googleKey"
className="w-full"
text="Import Service Account JSON Key"
successText="Successfully Imported Service Account JSON Key"
invalidText="Invalid Service Account JSON Key, Did you import the correct file?"
validator={credentials => {
if (!credentials) {
return false;
}
if (
!credentials.client_email ||
typeof credentials.client_email !== 'string' ||
credentials.client_email.length <= 2
) {
return false;
}
if (
!credentials.project_id ||
typeof credentials.project_id !== 'string' ||
credentials.project_id.length <= 2
) {
return false;
}
if (
!credentials.private_key ||
typeof credentials.private_key !== 'string' ||
credentials.private_key.length <= 600
) {
return false;
}
return true;
}}
onFileSelected={data => {
setToken(JSON.stringify(data));
}}
/>
) : (
<Input
id="chatGptLabel"
value={token || ''}
onChange={e => setToken(e.target.value || '')}
placeholder="Set the token."
className={cn(
defaultTextProps,
'flex h-10 max-h-10 w-full resize-none px-3 py-2 focus:outline-none focus:ring-0 focus:ring-opacity-0 focus:ring-offset-0'
)}
/>
)}
<small className="text-red-600">Your token will be sent to the server, but not saved.</small>
{helpText?.[endpoint]}
</div>
}
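For reference, the validator above only checks that `client_email`, `project_id`, and `private_key` exist and are plausibly sized strings. A hypothetical key object shaped like the JSON it accepts (all values are placeholders; the real file is the key downloaded from the Google Cloud console):

```js
// Placeholder service account key; the three fields below are the ones the validator checks.
const serviceAccountKey = {
  type: 'service_account',
  project_id: 'my-gcp-project',
  client_email: 'my-service-account@my-gcp-project.iam.gserviceaccount.com',
  // A real PEM-encoded private key easily exceeds the 600-character minimum checked above.
  private_key: '-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n'
};

// Once imported, the dialog stores the whole object as a string: setToken(JSON.stringify(data)).
```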

View File

@@ -4,6 +4,7 @@ import SubmitButton from './SubmitButton';
import OpenAIOptions from './OpenAIOptions';
import ChatGPTOptions from './ChatGPTOptions';
import BingAIOptions from './BingAIOptions';
import GoogleOptions from './GoogleOptions';
// import BingStyles from './BingStyles';
import NewConversationMenu from './NewConversationMenu';
import AdjustToneButton from './AdjustToneButton';
@@ -139,6 +140,7 @@ export default function TextChat({ isSearchView = false }) {
<span className="flex w-full flex-col items-center justify-center gap-0 md:order-none md:m-auto md:gap-2">
<OpenAIOptions />
<ChatGPTOptions />
<GoogleOptions />
<BingAIOptions show={showBingToneSetting} />
</span>
</div>

View File

@@ -1,5 +1,6 @@
import React from 'react';
import Clipboard from '../svg/Clipboard';
import CheckMark from '../svg/CheckMark';
import EditIcon from '../svg/EditIcon';
import RegenerateIcon from '../svg/RegenerateIcon';
@@ -13,10 +14,11 @@ export default function HoverButtons({
regenerate
}) {
const { endpoint, jailbreak = false } = conversation;
const [isCopied, setIsCopied] = React.useState(false);
const branchingSupported =
// azureOpenAI, openAI, chatGPTBrowser support branching, so edit enabled
!!['azureOpenAI', 'openAI', 'chatGPTBrowser'].find(e => e === endpoint) ||
!!['azureOpenAI', 'openAI', 'chatGPTBrowser', 'google'].find(e => e === endpoint) ||
// Sydney in bingAI supports branching, so edit enabled
(endpoint === 'bingAI' && jailbreak);
@@ -59,11 +61,11 @@ export default function HoverButtons({
<button
className="hover-button rounded-md p-1 hover:bg-gray-100 hover:text-gray-700 dark:text-gray-400 dark:hover:bg-gray-700 dark:hover:text-gray-200 disabled:dark:hover:text-gray-400 md:invisible md:group-hover:visible"
onClick={copyToClipboard}
onClick={() => copyToClipboard(setIsCopied)}
type="button"
title="copy to clipboard"
title={isCopied ? 'Copied to clipboard' : 'Copy to clipboard'}
>
<Clipboard />
{isCopied ? <CheckMark /> : <Clipboard />}
</button>
</div>
);

View File

@@ -98,8 +98,13 @@ export default function Message({
if (!isSubmitting && !message?.isCreatedByUser) regenerate(message);
};
const copyToClipboard = () => {
const copyToClipboard = (setIsCopied) => {
setIsCopied(true);
copy(message?.text);
setTimeout(() => {
setIsCopied(false);
}, 3000);
};
const clickSearchResult = async () => {
@@ -217,7 +222,7 @@ export default function Message({
conversation={conversation}
enterEdit={() => enterEdit()}
regenerate={() => regenerateMessage()}
copyToClipboard={() => copyToClipboard()}
copyToClipboard={copyToClipboard}
/>
<SubRow subclasses="switch-container">
<SiblingSwitch

View File

@@ -31,6 +31,11 @@ const MessageHeader = ({ isSearchView = false }) => {
const { chatGptLabel, model } = conversation;
if (model) _title += `: ${model}`;
if (chatGptLabel) _title += ` as ${chatGptLabel}`;
} else if (endpoint === 'google') {
_title = 'PaLM';
const { modelLabel, model } = conversation;
if (model) _title += `: ${model}`;
if (modelLabel) _title += ` as ${modelLabel}`;
} else if (endpoint === 'bingAI') {
const { jailbreak, toneStyle } = conversation;
if (toneStyle) _title += `: ${toneStyle}`;

View File

@@ -26,8 +26,7 @@ export default function ClearConvos() {
<Dialog>
<DialogTrigger asChild>
<button
className="flex cursor-pointer items-center gap-3 rounded-md py-3 px-3 text-sm text-white transition-colors duration-200 hover:bg-gray-500/10"
// onClick={clickHandler}
className="flex w-full cursor-pointer items-center gap-3 px-3 py-3 text-sm text-white transition-colors duration-200 hover:bg-gray-700"
>
<TrashIcon />
Clear conversations

View File

@@ -11,7 +11,7 @@ export default function DarkMode() {
return (
<button
className="flex cursor-pointer items-center gap-3 rounded-md py-3 px-3 text-sm text-white transition-colors duration-200 hover:bg-gray-500/10"
className="flex w-full cursor-pointer items-center gap-3 px-3 py-3 text-sm text-white transition-colors duration-200 hover:bg-gray-700"
onClick={clickHandler}
>
{theme === 'dark' ? <LightModeIcon /> : <DarkModeIcon />}

View File

@@ -25,7 +25,7 @@ export default function ExportConversation() {
<>
<button
className={cn(
'flex items-center gap-3 rounded-md py-3 px-3 text-sm transition-colors duration-200 hover:bg-gray-500/10',
'flex py-3 px-3 items-center gap-3 transition-colors duration-200 text-white cursor-pointer text-sm hover:bg-gray-700 w-full',
exportable ? 'cursor-pointer text-white' : 'cursor-not-allowed text-gray-400'
)}
onClick={clickHandler}

View File

@@ -12,7 +12,7 @@ export default function Logout() {
return (
<button
className="flex cursor-pointer items-center gap-3 rounded-md py-3 px-3 text-sm text-white transition-colors duration-200 hover:bg-gray-500/10"
className="flex py-3 px-3 items-center gap-3 transition-colors duration-200 text-white cursor-pointer text-sm hover:bg-gray-700 w-full"
onClick={handleLogout}
>
<LogOutIcon />

View File

@@ -1,21 +1,77 @@
import { Menu, Transition } from '@headlessui/react';
import { Fragment, useEffect, useRef, useState } from 'react';
import SearchBar from './SearchBar';
import ClearConvos from './ClearConvos';
import DarkMode from './DarkMode';
import Logout from './Logout';
import ExportConversation from './ExportConversation';
import { useAuthContext } from '~/hooks/AuthContext';
import { cn } from '~/utils/';
import DotsIcon from '../svg/DotsIcon';
export default function NavLinks({ clearSearch, isSearchEnabled }) {
const { user, logout } = useAuthContext();
return (
<>
{!!isSearchEnabled && (
<SearchBar
clearSearch={clearSearch}
/>
<Menu
as="div"
className="group relative"
>
{({ open }) => (
<>
<Menu.Button
className={cn(
'group-ui-open:bg-gray-800 flex w-full items-center gap-2.5 rounded-md px-3 py-3 text-sm transition-colors duration-200 hover:bg-gray-800',
open ? 'bg-gray-800' : ''
)}
>
<div className="-ml-0.5 h-5 w-5 flex-shrink-0">
<div className="relative flex">
<img
className="rounded-sm"
src={user?.avatar || `https://avatars.dicebear.com/api/initials/${user?.name}.svg`}
alt=""
/>
</div>
</div>
<div className="grow overflow-hidden text-ellipsis whitespace-nowrap text-left text-white">
{user?.name || 'USER'}
</div>
<DotsIcon />
</Menu.Button>
<Transition
as={Fragment}
enter="transition ease-out duration-100"
enterFrom="transform opacity-0 scale-95"
enterTo="transform opacity-100 scale-100"
leave="transition ease-in duration-75"
leaveFrom="transform opacity-100 scale-100"
leaveTo="transform opacity-0 scale-95"
>
<Menu.Items className="absolute bottom-full left-0 z-20 mb-2 w-full translate-y-0 overflow-hidden rounded-md bg-[#050509] py-1.5 opacity-100 outline-none">
<Menu.Item>
{({}) => <>{!!isSearchEnabled && <SearchBar clearSearch={clearSearch} />}</>}
</Menu.Item>
<Menu.Item>{({}) => <ExportConversation />}</Menu.Item>
<div
className="my-1.5 h-px bg-white/20"
role="none"
></div>
<Menu.Item>{({}) => <DarkMode />}</Menu.Item>
<Menu.Item>{({}) => <ClearConvos />}</Menu.Item>
<div
className="my-1.5 h-px bg-white/20"
role="none"
></div>
<Menu.Item>
<Logout />
</Menu.Item>
</Menu.Items>
</Transition>
</>
)}
<ExportConversation />
<DarkMode />
<ClearConvos />
<Logout />
</>
</Menu>
);
}

View File

@@ -142,8 +142,8 @@ export default function Nav({ navVisible, setNavVisible }) {
}
>
<div className="flex h-full min-h-0 flex-col ">
<div className="scrollbar-trigger flex h-full w-full flex-1 items-start border-white/20">
<nav className="flex h-full flex-1 flex-col space-y-1 p-2">
<div className="scrollbar-trigger flex h-full w-full flex-1 items-start border-white/20 relative">
<nav className="flex h-full flex-1 flex-col space-y-1 p-2 relative">
<NewChat />
<div
className={`flex-1 flex-col overflow-y-auto ${

View File

@@ -0,0 +1,34 @@
import React from 'react';
export default function DotsIcon() {
return (
<svg
stroke="currentColor"
fill="none"
strokeWidth="2"
viewBox="0 0 24 24"
strokeLinecap="round"
strokeLinejoin="round"
className="h-4 w-4 flex-shrink-0 text-gray-500"
height="1em"
width="1em"
xmlns="http://www.w3.org/2000/svg"
>
<circle
cx="12"
cy="12"
r="1"
/>
<circle
cx="19"
cy="12"
r="1"
/>
<circle
cx="5"
cy="12"
r="1"
/>
</svg>
);
}

View File

@@ -0,0 +1,21 @@
import { cn } from '~/utils/';
export default function MessagesSquared({ className }) {
return (
<svg
xmlns="http://www.w3.org/2000/svg"
width="24"
height="24"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
strokeWidth="2"
strokeLinecap="round"
strokeLinejoin="round"
className={cn(className, 'lucide lucide-messages-square')}
>
<path d="M14 9a2 2 0 0 1-2 2H6l-4 4V4c0-1.1.9-2 2-2h8a2 2 0 0 1 2 2v5Z" />
<path d="M18 9h2a2 2 0 0 1 2 2v11l-4-4h-6a2 2 0 0 1-2-2v-1" />
</svg>
);
}

View File

@@ -29,26 +29,6 @@ export default function DialogTemplate({
<DialogTitle className="text-gray-800 dark:text-white">{title}</DialogTitle>
<DialogDescription className="text-gray-600 dark:text-gray-300">{description}</DialogDescription>
</DialogHeader>
{/* <div className="grid gap-4 py-4">
<div className="grid grid-cols-4 items-center gap-4"> //input template
</div>
<div className="grid grid-cols-4 items-center gap-4">
<Label
htmlFor="promptPrefix"
className="text-right"
>
Prompt Prefix
</Label>
<TextareaAutosize
id="promptPrefix"
value={promptPrefix}
onChange={(e) => setPromptPrefix(e.target.value)}
placeholder="Set custom instructions. Defaults to: 'You are ChatGPT, a large language model trained by OpenAI.'"
className="col-span-3 flex h-20 w-full resize-none rounded-md border border-gray-300 bg-transparent py-2 px-3 text-sm shadow-[0_0_10px_rgba(0,0,0,0.10)] outline-none placeholder:text-gray-400 focus:outline-none focus:ring-gray-400 focus:ring-opacity-20 focus:ring-offset-2 disabled:cursor-not-allowed disabled:opacity-50 dark:border-none dark:bg-gray-700 dark:text-gray-50 dark:shadow-[0_0_15px_rgba(0,0,0,0.10)] dark:focus:border-none dark:focus:border-transparent dark:focus:outline-none dark:focus:ring-0 dark:focus:ring-gray-400 dark:focus:ring-offset-0"
/>
</div>
</div> */}
{main ? main : null}
<DialogFooter>
<div>{leftButtons ? leftButtons : null}</div>

View File

@@ -8,6 +8,7 @@ export default function createPayload(submission: TSubmission) {
const endpointUrlMap = {
azureOpenAI: '/api/ask/azureOpenAI',
openAI: '/api/ask/openAI',
google: '/api/ask/google',
bingAI: '/api/ask/bingAI',
chatGPTBrowser: '/api/ask/chatGPTBrowser'
};

View File

@@ -11,6 +11,11 @@ export type TMessage = {
updatedAt: string,
};
export type TExample = {
input: string,
output: string,
};
export type TSubmission = {
clientId?: string;
context?: string;
@@ -43,7 +48,8 @@ export enum EModelEndpoint {
openAI = 'openAI',
bingAI = 'bingAI',
chatGPT = 'chatGPT',
chatGPTBrowser = 'chatGPTBrowser'
chatGPTBrowser = 'chatGPTBrowser',
google = 'google',
}
export type TConversation = {
@@ -55,11 +61,19 @@ export type TConversation = {
messages?: TMessage[];
createdAt: string;
updatedAt: string;
// google only
modelLabel?: string;
examples?: TExample[];
// for azureOpenAI, openAI only
chatGptLabel?: string;
userLabel?: string;
model?: string;
promptPrefix?: string;
temperature?: number;
topP?: number;
topK?: number;
// bing and google
context?: string;
top_p?: number;
presence_penalty?: number;
// for bingAI only

View File

@@ -6,7 +6,8 @@ const endpointsConfig = atom({
azureOpenAI: null,
openAI: null,
bingAI: null,
chatGPTBrowser: null
chatGPTBrowser: null,
google: null,
}
});
@@ -24,7 +25,7 @@ const endpointsFilter = selector({
const availableEndpoints = selector({
key: 'availableEndpoints',
get: ({ get }) => {
const endpoints = ['azureOpenAI', 'openAI', 'bingAI', 'chatGPTBrowser'];
const endpoints = ['azureOpenAI', 'openAI', 'chatGPTBrowser', 'bingAI', 'google'];
const f = get(endpointsFilter);
return endpoints.filter(endpoint => f[endpoint]);
}

View File

@@ -2,6 +2,110 @@
@tailwind components;
@tailwind utilities;
@font-face {
font-display: swap;
font-family: Signifier;
font-style: normal;
font-weight: 400;
src: url("../fonts/signifier-light.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Signifier;
font-style: italic;
font-weight: 400;
src: url("../fonts/signifier-light-italic.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Signifier;
font-style: normal;
font-weight: 700;
src: url("../fonts/signifier-bold.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Signifier;
font-style: italic;
font-weight: 700;
src: url("../fonts/signifier-bold-italic.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Söhne;
font-style: normal;
font-weight: 400;
src: url("../fonts/soehne-buch.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Söhne;
font-style: italic;
font-weight: 400;
src: url("../fonts/soehne-buch-kursiv.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Söhne;
font-style: normal;
font-weight: 500;
src: url("../fonts/soehne-kraftig.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Söhne;
font-style: italic;
font-weight: 500;
src: url("../fonts/soehne-kraftig-kursiv.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Söhne;
font-style: normal;
font-weight: 600;
src: url("../fonts/soehne-halbfett.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Söhne;
font-style: italic;
font-weight: 600;
src: url("../fonts/soehne-halbfett-kursiv.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Söhne Mono;
font-style: normal;
font-weight: 400;
src: url("../fonts/soehne-mono-buch.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Söhne Mono;
font-style: normal;
font-weight: 700;
src: url("../fonts/soehne-mono-halbfett.woff2") format("woff2")
}
@font-face {
font-display: swap;
font-family: Söhne Mono;
font-style: italic;
font-weight: 400;
src: url("../fonts/soehne-mono-buch-kursiv.woff2") format("woff2")
}
/* * {
box-sizing: border-box;
outline: 1px solid limegreen !important;

View File

@@ -15,6 +15,20 @@ const cleanupPreset = ({ preset: _preset, endpointsConfig = {} }) => {
frequency_penalty: _preset?.frequency_penalty ?? 0,
title: _preset?.title ?? 'New Preset'
};
} else if (endpoint === 'google') {
preset = {
endpoint,
presetId: _preset?.presetId ?? null,
model: _preset?.model ?? endpointsConfig[endpoint]?.availableModels?.[0] ?? 'chat-bison',
modelLabel: _preset?.modelLabel ?? null,
examples: _preset?.examples ?? [{ input: { content: '' }, output: { content: '' } }],
promptPrefix: _preset?.promptPrefix ?? null,
temperature: _preset?.temperature ?? 0.2,
maxOutputTokens: _preset?.maxOutputTokens ?? 1024,
topP: _preset?.topP ?? 0.95,
topK: _preset?.topK ?? 40,
title: _preset?.title ?? 'New Preset'
};
} else if (endpoint === 'bingAI') {
preset = {
endpoint,
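As a quick illustration of the google defaults above, a sketch of what `cleanupPreset` returns for a near-empty google preset (assuming no `availableModels` are configured, so the final `'chat-bison'` fallback applies):

```js
import cleanupPreset from '~/utils/cleanupPreset';

const preset = cleanupPreset({ preset: { endpoint: 'google' }, endpointsConfig: {} });
// Per the ?? fallbacks above:
// {
//   endpoint: 'google',
//   presetId: null,
//   model: 'chat-bison',
//   modelLabel: null,
//   examples: [{ input: { content: '' }, output: { content: '' } }],
//   promptPrefix: null,
//   temperature: 0.2,
//   maxOutputTokens: 1024,
//   topP: 0.95,
//   topK: 40,
//   title: 'New Preset'
// }
```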

View File

@@ -22,6 +22,23 @@ const buildDefaultConversation = ({
presence_penalty: lastConversationSetup?.presence_penalty ?? 0,
frequency_penalty: lastConversationSetup?.frequency_penalty ?? 0
};
} else if (endpoint === 'google') {
conversation = {
...conversation,
endpoint,
model:
lastConversationSetup?.model ??
lastSelectedModel[endpoint] ??
endpointsConfig[endpoint]?.availableModels?.[0] ??
'chat-bison',
modelLabel: lastConversationSetup?.modelLabel ?? null,
promptPrefix: lastConversationSetup?.promptPrefix ?? null,
examples: lastConversationSetup?.examples ?? [{ input: { content: '' }, output: { content: '' }}],
temperature: lastConversationSetup?.temperature ?? 0.2,
maxOutputTokens: lastConversationSetup?.maxOutputTokens ?? 1024,
topP: lastConversationSetup?.topP ?? 0.95,
topK: lastConversationSetup?.topK ?? 40,
};
} else if (endpoint === 'bingAI') {
conversation = {
...conversation,
@@ -108,7 +125,7 @@ const getDefaultConversation = ({ conversation, prevConversation, endpointsConfi
// if anything happens, reset to default model
const endpoint = ['openAI', 'azureOpenAI', 'bingAI', 'chatGPTBrowser'].find(e => endpointsConfig?.[e]);
const endpoint = ['openAI', 'azureOpenAI', 'bingAI', 'chatGPTBrowser', 'google'].find(e => endpointsConfig?.[e]);
if (endpoint) {
conversation = buildDefaultConversation({ conversation, endpoint, endpointsConfig });
return conversation;

View File

@@ -42,6 +42,10 @@ const getIcon = props => {
? `rgba(16, 163, 127, ${button ? 0.75 : 1})`
: `rgba(16, 163, 127, ${button ? 0.75 : 1})`);
name = chatGptLabel || 'ChatGPT';
} else if (endpoint === 'google') {
const { modelLabel } = props;
icon = <img src='/assets/palm.png' />;
name = modelLabel || 'PaLM2';
} else if (endpoint === 'bingAI') {
const { jailbreak } = props;

View File

@@ -40,6 +40,21 @@ const useMessageHandler = () => {
frequency_penalty: currentConversation?.frequency_penalty ?? 0
};
responseSender = endpointOption.chatGptLabel ?? 'ChatGPT';
} else if (endpoint === 'google') {
endpointOption = {
endpoint,
model:
currentConversation?.model ?? endpointsConfig[endpoint]?.availableModels?.[0] ?? 'chat-bison',
chatGptLabel: currentConversation?.chatGptLabel ?? null,
promptPrefix: currentConversation?.promptPrefix ?? null,
examples: currentConversation?.examples ?? [{ input: { content: '' }, output: { content: '' }}],
temperature: currentConversation?.temperature ?? 0.2,
maxOutputTokens: currentConversation?.maxOutputTokens ?? 1024,
topP: currentConversation?.topP ?? 0.95,
topK: currentConversation?.topK ?? 40,
token: endpointsConfig[endpoint]?.userProvide ? getToken() : null
};
responseSender = endpointOption.chatGptLabel ?? 'ChatGPT';
} else if (endpoint === 'bingAI') {
endpointOption = {
endpoint,
@@ -125,7 +140,7 @@ const useMessageHandler = () => {
initialResponse
};
console.log('User Input:', text);
console.log('User Input:', text, submission);
if (isRegenerate) {
setMessages([...currentMessages, initialResponse]);

View File

@@ -9,10 +9,11 @@ module.exports = {
// colors: {
// 'gpt-dark-gray': '#343541',
// },
fontFamily: {
sans: ['Söhne', 'sans-serif'],
mono: ['Söhne Mono', 'monospace'],
},
extend: {
// fontFamily: {
// sans: ['var(--font-sans)', ...fontFamily.sans]
// },
keyframes: {
'accordion-down': {
from: { height: 0 },
@@ -52,7 +53,7 @@ module.exports = {
800: "#06373e",
900: "#031f29",
},
}
}
}
},
plugins: [

View File

@@ -1,4 +1,6 @@
# Linux Installation
Thanks to @DavidDev1334!
## Prerequisites
@@ -86,15 +88,13 @@ You will need all your credentials, (API keys, access tokens, and MongoDB Connec
## Run the project
### Using the command line
### Using the command line (in the root directory)
1. Run `npm ci` in the "/home/user/chatgpt-clone/api" directory
2. Run `npm ci` in the "/home/user/chatgpt-clone/client" directory
3. Run `npm run build` in the "/home/user/chatgpt-clone/client"
4. Run `meilisearch --master-key put_your_meilesearch_Master_Key_here` in the "/home/user/chatgpt-clone" directory (Only if SEARCH=TRUE)
5. Run `npm start` in the "/home/user/chatgpt-clone/api" directory
6. Visit http://localhost:3080 (default port) & enjoy
1. Run `npm ci`
2. Run `npm run frontend-dev`
3. Run `npm run backend`
4. Run `meilisearch --master-key put_your_meilesearch_Master_Key_here` (Only if SEARCH=TRUE)
5. Visit http://localhost:3080 (default port) & enjoy
### Using a shell script

View File

@@ -58,14 +58,8 @@ Follow the instructions for setting up proxies, access tokens, and user system:
- Create a .env file in the api directory by running cp api/.env.example api/.env and edit the file with your preferred text editor, adding the required API keys, access tokens, and MongoDB connection string
- Run npm ci in both the api and client directories by running:
```
cd api && npm ci && cd ..
cd client && npm ci && cd ..
```
- Build the client by running cd client && npm run build && cd ..
- Run `npm ci` in the root directory
- Build the client by running `npm run frontend-dev`
**Download MeiliSearch for macOS:**
- You can download the latest MeiliSearch binary for macOS from their GitHub releases page: https://github.com/meilisearch/MeiliSearch/releases. Look for the file named meilisearch-macos-amd64 (or the equivalent for your system architecture) and download it.

View File

@@ -66,15 +66,13 @@ You will need all your credentials, (API keys, access tokens, and Mongo Connecti
### Run the app
#### Using the command line
### Using the command line (in the root directory)
- **Run** `npm ci` in the "C:/chatgpt-clone/api" directory
- **Run** `npm ci` in the "C:/chatgpt-clone/client" directory
- **Run** `npm run build` in the "C:/chatgpt-clone/client"
- **Run** `"meilisearch --master-key put_your_meilesearch_Master_Key_here"` in the "C:/chatgpt-clone" directory (Only if SEARCH=TRUE)
- **Run** `npm start` in the "C:/chatgpt-clone/api" directory
- **Visit** http://localhost:3080 (default port) & enjoy
1. Run `npm ci`
2. Run `npm run frontend-dev`
3. Run `npm run backend`
4. Run `meilisearch --master-key put_your_meilesearch_Master_Key_here` (Only if SEARCH=TRUE)
5. Visit http://localhost:3080 (default port) & enjoy
#### Using a batch file

package-lock.json (generated): file diff suppressed because it is too large (26,891 changed lines).

View File

@@ -1,8 +1,16 @@
{
"name": "chatgpt-clone",
"version": "0.4.1",
"version": "0.4.4",
"description": "",
"workspaces": [
"api",
"client"
],
"scripts": {
"backend": "cd api && npm run start",
"backend-dev": "cd api && npm run server-dev",
"frontend": "cd client && npm run build",
"frontend-dev": "cd client && npm run dev",
"e2e": "playwright test --config=e2e/playwright.config.js",
"e2e:update": "playwright test --config=e2e/playwright.config.js --update-snapshots",
"e2e:debug": "cross-env PWDEBUG=1 playwright test --config=e2e/playwright.config.js",