Compare commits


24 Commits
v3.9.0 ... main

Author SHA1 Message Date
Jared Van Bortel
b666d16db5
ci: update path-filtering orb to 1.3.0 (#3588)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-05-27 15:46:52 -04:00
Jared Van Bortel
cd70db29ed
readme: add Windows ARM download link
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-24 19:51:59 -05:00
Jared Van Bortel
fb72ba1ff5
chat: bump version to 3.10.1-dev0
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-24 19:44:45 -05:00
Jared Van Bortel
b968d45c11
chat: release version 3.10.0 (#3515)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-24 19:41:13 -05:00
Jared Van Bortel
228d5379cf
chat: cut v3.10.0 release (#3511)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-24 17:15:34 -05:00
Jared Van Bortel
dd820ef7c4
Italian and draft Simplified Chinese translations for v3.10.0 (#3514)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-24 17:14:10 -05:00
Jared Van Bortel
a7cbc8c3fd
Run lupdate before v3.10.0 release (#3512)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-24 15:33:27 -05:00
AT
4d171835ac
Add new remote model provider view. (#3506)
Signed-off-by: Adam Treat <treat.adam@gmail.com>
Signed-off-by: AT <manyoso@users.noreply.github.com>
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
Co-authored-by: Jared Van Bortel <jared@nomic.ai>
2025-02-24 14:59:53 -05:00
Lil Bob
0c28ee7059
Translations: Improve Chinese translation (#3467)
Signed-off-by: Junior2Ran <hdr01@126.com>
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
Co-authored-by: Jared Van Bortel <jared@nomic.ai>
2025-02-20 20:44:28 -05:00
Jared Van Bortel
96aeb44210
backend: build with CUDA compute 5.0 support by default (#3499)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-19 11:27:06 -05:00
Jared Van Bortel
29f29773af
chat: require Qt 6.8 and fix #includes (#3498)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-18 13:59:50 -05:00
Jared Van Bortel
d8c04cead8
ci: use LLVM Clang 19 on macOS and Ubuntu (#3500)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-18 12:02:14 -05:00
Riccardo Giovanetti
b1cb46ec2a
Italian localization update (#3496)
Signed-off-by: Riccardo Giovanetti <riccardo.giovanetti@gmail.com>
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
Co-authored-by: Jared Van Bortel <jared@nomic.ai>
2025-02-18 11:47:39 -05:00
Jared Van Bortel
b83d06e67f
translations: run lupdate -no-obsolete on Simplified Chinese
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-13 11:27:04 -05:00
Jared Van Bortel
7aa339cf40
translations: run lupdate
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-13 11:26:28 -05:00
ThiloteE
1b84182030
Add replacement templates for OLMoE and granite-3.1 (#3471)
Signed-off-by: ThiloteE <73715071+ThiloteE@users.noreply.github.com>
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
Co-authored-by: Jared Van Bortel <jared@nomic.ai>
2025-02-12 14:23:46 -05:00
ThiloteE
02e12089d3
Add Granite arch to model whitelist (#3487)
Signed-off-by: ThiloteE <73715071+ThiloteE@users.noreply.github.com>
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
Co-authored-by: Jared Van Bortel <jared@nomic.ai>
2025-02-12 14:17:49 -05:00
Jared Van Bortel
09f37a0ff8
maintainers: remove extra bracket
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-11 14:49:46 -05:00
AT
5e7e4b3f78
Fix spacing issues with deepseek models: (#3470)
Signed-off-by: Adam Treat <treat.adam@gmail.com>
Signed-off-by: AT <manyoso@users.noreply.github.com>
2025-02-06 12:04:32 -05:00
Jared Van Bortel
22ebd42c32
Misc fixes for undefined behavior, crashes, and build failure (#3465)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-06 11:22:52 -05:00
Jared Van Bortel
051a63f031
ci: fix scheduled workflow jobs
s/online/offline/

Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-05 11:56:53 -05:00
Jared Van Bortel
26356f872e
chat: bump version to 3.9.1-dev0
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-04 19:15:20 -05:00
Jared Van Bortel
22b8bc546f
chat: release version 3.9.0 (#3462)
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-04 19:12:17 -05:00
Jared Van Bortel
52164142de
changelog: fix missing paren
Signed-off-by: Jared Van Bortel <jared@nomic.ai>
2025-02-04 18:14:30 -05:00
79 changed files with 3994 additions and 2483 deletions

View File

@@ -1,7 +1,7 @@
version: 2.1 version: 2.1
setup: true setup: true
orbs: orbs:
path-filtering: circleci/path-filtering@1.1.0 path-filtering: circleci/path-filtering@1.3.0
workflows: workflows:
version: 2.1 version: 2.1

View File

@@ -18,6 +18,68 @@ parameters:
type: boolean type: boolean
default: false default: false
job-macos-executor: &job-macos-executor
macos:
xcode: 16.2.0
resource_class: macos.m1.medium.gen1
environment:
HOMEBREW_NO_AUTO_UPDATE: 1
job-macos-install-deps: &job-macos-install-deps
name: Install basic macOS build dependencies
command: brew install ccache llvm wget
job-linux-install-chat-deps: &job-linux-install-chat-deps
name: Install Linux build dependencies for gpt4all-chat
command: |
# Prevent apt-get from interactively prompting for service restart
echo "\$nrconf{restart} = 'l'" | sudo tee /etc/needrestart/conf.d/90-autorestart.conf >/dev/null
wget -qO- 'https://apt.llvm.org/llvm-snapshot.gpg.key' | sudo tee /etc/apt/trusted.gpg.d/apt.llvm.org.asc >/dev/null
sudo add-apt-repository -yn 'deb http://apt.llvm.org/jammy/ llvm-toolchain-jammy-19 main'
wget -qO- "https://packages.lunarg.com/lunarg-signing-key-pub.asc" \
| sudo tee /etc/apt/trusted.gpg.d/lunarg.asc >/dev/null
wget -qO- "https://packages.lunarg.com/vulkan/1.3.290/lunarg-vulkan-1.3.290-jammy.list" \
| sudo tee /etc/apt/sources.list.d/lunarg-vulkan-1.3.290-jammy.list >/dev/null
wget "https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb"
sudo dpkg -i cuda-keyring_1.1-1_all.deb
packages=(
bison build-essential ccache clang-19 clang-tools-19 cuda-compiler-11-8 flex gperf libcublas-dev-11-8
libfontconfig1 libfreetype6 libgl1-mesa-dev libmysqlclient21 libnvidia-compute-550-server libodbc2 libpq5
libstdc++-12-dev libwayland-dev libx11-6 libx11-xcb1 libxcb-cursor0 libxcb-glx0 libxcb-icccm4 libxcb-image0
libxcb-keysyms1 libxcb-randr0 libxcb-render-util0 libxcb-shape0 libxcb-shm0 libxcb-sync1 libxcb-util1
libxcb-xfixes0 libxcb-xinerama0 libxcb-xkb1 libxcb1 libxext6 libxfixes3 libxi6 libxkbcommon-dev libxkbcommon-x11-0
libxrender1 patchelf python3 vulkan-sdk python3 vulkan-sdk
)
sudo apt-get update
sudo apt-get install -y "${packages[@]}"
wget "https://qt.mirror.constant.com/archive/online_installers/4.8/qt-online-installer-linux-x64-4.8.1.run"
chmod +x qt-online-installer-linux-x64-4.8.1.run
./qt-online-installer-linux-x64-4.8.1.run --no-force-installations --no-default-installations \
--no-size-checking --default-answer --accept-licenses --confirm-command --accept-obligations \
--email "$QT_EMAIL" --password "$QT_PASSWORD" install \
qt.tools.cmake qt.tools.ifw.48 qt.tools.ninja qt.qt6.682.linux_gcc_64 qt.qt6.682.addons.qt5compat \
qt.qt6.682.debug_info extensions.qtpdf.682 qt.qt6.682.addons.qthttpserver
job-linux-install-backend-deps: &job-linux-install-backend-deps
name: Install Linux build dependencies for gpt4all-backend
command: |
wget -qO- 'https://apt.llvm.org/llvm-snapshot.gpg.key' | sudo tee /etc/apt/trusted.gpg.d/apt.llvm.org.asc >/dev/null
sudo add-apt-repository -yn 'deb http://apt.llvm.org/jammy/ llvm-toolchain-jammy-19 main'
wget -qO- "https://packages.lunarg.com/lunarg-signing-key-pub.asc" \
| sudo tee /etc/apt/trusted.gpg.d/lunarg.asc >/dev/null
wget -qO- "https://packages.lunarg.com/vulkan/1.3.290/lunarg-vulkan-1.3.290-jammy.list" \
| sudo tee /etc/apt/sources.list.d/lunarg-vulkan-1.3.290-jammy.list >/dev/null
wget "https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb"
sudo dpkg -i cuda-keyring_1.1-1_all.deb
packages=(
build-essential ccache clang-19 clang-tools-19 cuda-compiler-11-8 libcublas-dev-11-8
libnvidia-compute-550-server libstdc++-12-dev vulkan-sdk
)
sudo apt-get update
sudo apt-get install -y "${packages[@]}"
pyenv global 3.13.2
pip install setuptools wheel cmake ninja
jobs: jobs:
# work around CircleCI-Public/path-filtering-orb#20 # work around CircleCI-Public/path-filtering-orb#20
noop: noop:
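The block added above defines reusable fragments with YAML anchors: `&name` labels a mapping, and `<<: *name` later splices that mapping's keys into a job. A minimal sketch of the pattern (job names here are illustrative, not taken from the config):

```yaml
# Anchor definition: "&job-macos-executor" labels this mapping.
job-macos-executor: &job-macos-executor
  macos:
    xcode: 16.2.0
  resource_class: macos.m1.medium.gen1

jobs:
  build-example:
    # Merge key: expands to the macos/resource_class keys above,
    # so every macOS job shares one executor definition.
    <<: *job-macos-executor
    steps:
      - checkout
```

This is why the later hunks can replace each repeated `macos: / xcode:` stanza with a single `<<: *job-macos-executor` line.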
@@ -34,8 +96,7 @@ jobs:
name: Verify that commit is on the main branch name: Verify that commit is on the main branch
command: git merge-base --is-ancestor HEAD main command: git merge-base --is-ancestor HEAD main
build-offline-chat-installer-macos: build-offline-chat-installer-macos:
macos: <<: *job-macos-executor
xcode: 15.4.0
steps: steps:
- checkout - checkout
- run: - run:
@@ -46,12 +107,11 @@ jobs:
- restore_cache: - restore_cache:
keys: keys:
- ccache-gpt4all-macos- - ccache-gpt4all-macos-
- run:
<<: *job-macos-install-deps
- run: - run:
name: Install Rosetta name: Install Rosetta
command: softwareupdate --install-rosetta --agree-to-license # needed for QtIFW command: softwareupdate --install-rosetta --agree-to-license # needed for QtIFW
- run:
name: Install dependencies
command: brew install ccache wget
- run: - run:
name: Installing Qt name: Installing Qt
command: | command: |
@@ -86,6 +146,9 @@ jobs:
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_PREFIX_PATH:PATH=~/Qt/6.8.2/macos/lib/cmake \ -DCMAKE_PREFIX_PATH:PATH=~/Qt/6.8.2/macos/lib/cmake \
-DCMAKE_MAKE_PROGRAM:FILEPATH=~/Qt/Tools/Ninja/ninja \ -DCMAKE_MAKE_PROGRAM:FILEPATH=~/Qt/Tools/Ninja/ninja \
-DCMAKE_C_COMPILER=/opt/homebrew/opt/llvm/bin/clang \
-DCMAKE_CXX_COMPILER=/opt/homebrew/opt/llvm/bin/clang++ \
-DCMAKE_RANLIB=/usr/bin/ranlib \
-DCMAKE_C_COMPILER_LAUNCHER=ccache \ -DCMAKE_C_COMPILER_LAUNCHER=ccache \
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \ -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
-DBUILD_UNIVERSAL=ON \ -DBUILD_UNIVERSAL=ON \
@@ -128,8 +191,7 @@ jobs:
- upload - upload
sign-offline-chat-installer-macos: sign-offline-chat-installer-macos:
macos: <<: *job-macos-executor
xcode: 15.4.0
steps: steps:
- checkout - checkout
# attach to a workspace containing unsigned dmg # attach to a workspace containing unsigned dmg
@@ -163,8 +225,7 @@ jobs:
- upload - upload
notarize-offline-chat-installer-macos: notarize-offline-chat-installer-macos:
macos: <<: *job-macos-executor
xcode: 15.4.0
steps: steps:
- checkout - checkout
- attach_workspace: - attach_workspace:
@@ -203,8 +264,7 @@ jobs:
hdiutil detach /Volumes/gpt4all-installer-darwin hdiutil detach /Volumes/gpt4all-installer-darwin
build-online-chat-installer-macos: build-online-chat-installer-macos:
macos: <<: *job-macos-executor
xcode: 15.4.0
steps: steps:
- checkout - checkout
- run: - run:
@@ -215,12 +275,11 @@ jobs:
- restore_cache: - restore_cache:
keys: keys:
- ccache-gpt4all-macos- - ccache-gpt4all-macos-
- run:
<<: *job-macos-install-deps
- run: - run:
name: Install Rosetta name: Install Rosetta
command: softwareupdate --install-rosetta --agree-to-license # needed for QtIFW command: softwareupdate --install-rosetta --agree-to-license # needed for QtIFW
- run:
name: Install dependencies
command: brew install ccache wget
- run: - run:
name: Installing Qt name: Installing Qt
command: | command: |
@@ -255,6 +314,9 @@ jobs:
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_PREFIX_PATH:PATH=~/Qt/6.8.2/macos/lib/cmake \ -DCMAKE_PREFIX_PATH:PATH=~/Qt/6.8.2/macos/lib/cmake \
-DCMAKE_MAKE_PROGRAM:FILEPATH=~/Qt/Tools/Ninja/ninja \ -DCMAKE_MAKE_PROGRAM:FILEPATH=~/Qt/Tools/Ninja/ninja \
-DCMAKE_C_COMPILER=/opt/homebrew/opt/llvm/bin/clang \
-DCMAKE_CXX_COMPILER=/opt/homebrew/opt/llvm/bin/clang++ \
-DCMAKE_RANLIB=/usr/bin/ranlib \
-DCMAKE_C_COMPILER_LAUNCHER=ccache \ -DCMAKE_C_COMPILER_LAUNCHER=ccache \
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \ -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
-DBUILD_UNIVERSAL=ON \ -DBUILD_UNIVERSAL=ON \
@@ -290,8 +352,7 @@ jobs:
- upload - upload
sign-online-chat-installer-macos: sign-online-chat-installer-macos:
macos: <<: *job-macos-executor
xcode: 15.4.0
steps: steps:
- checkout - checkout
# attach to a workspace containing unsigned dmg # attach to a workspace containing unsigned dmg
@@ -325,8 +386,7 @@ jobs:
- upload - upload
notarize-online-chat-installer-macos: notarize-online-chat-installer-macos:
macos: <<: *job-macos-executor
xcode: 15.4.0
steps: steps:
- checkout - checkout
- attach_workspace: - attach_workspace:
@@ -379,34 +439,7 @@ jobs:
keys: keys:
- ccache-gpt4all-linux-amd64- - ccache-gpt4all-linux-amd64-
- run: - run:
name: Setup Linux and Dependencies <<: *job-linux-install-chat-deps
command: |
# Prevent apt-get from interactively prompting for service restart
echo "\$nrconf{restart} = 'l'" | sudo tee /etc/needrestart/conf.d/90-autorestart.conf >/dev/null
wget -qO- "https://packages.lunarg.com/lunarg-signing-key-pub.asc" | sudo tee /etc/apt/trusted.gpg.d/lunarg.asc
wget -qO- "https://packages.lunarg.com/vulkan/1.3.290/lunarg-vulkan-1.3.290-jammy.list" | sudo tee /etc/apt/sources.list.d/lunarg-vulkan-1.3.290-jammy.list
wget "https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb"
sudo dpkg -i cuda-keyring_1.1-1_all.deb
packages=(
bison build-essential ccache cuda-compiler-11-8 flex g++-12 gperf libcublas-dev-11-8 libfontconfig1
libfreetype6 libgl1-mesa-dev libmysqlclient21 libnvidia-compute-550-server libodbc2 libpq5 libwayland-dev
libx11-6 libx11-xcb1 libxcb-cursor0 libxcb-glx0 libxcb-icccm4 libxcb-image0 libxcb-keysyms1 libxcb-randr0
libxcb-render-util0 libxcb-shape0 libxcb-shm0 libxcb-sync1 libxcb-util1 libxcb-xfixes0 libxcb-xinerama0
libxcb-xkb1 libxcb1 libxext6 libxfixes3 libxi6 libxkbcommon-dev libxkbcommon-x11-0 libxrender1 patchelf
python3 vulkan-sdk
)
sudo apt-get update
sudo apt-get install -y "${packages[@]}"
- run:
name: Installing Qt
command: |
wget "https://qt.mirror.constant.com/archive/online_installers/4.8/qt-online-installer-linux-x64-4.8.1.run"
chmod +x qt-online-installer-linux-x64-4.8.1.run
./qt-online-installer-linux-x64-4.8.1.run --no-force-installations --no-default-installations \
--no-size-checking --default-answer --accept-licenses --confirm-command --accept-obligations \
--email "$QT_EMAIL" --password "$QT_PASSWORD" install \
qt.tools.cmake qt.tools.ifw.48 qt.tools.ninja qt.qt6.682.linux_gcc_64 qt.qt6.682.addons.qt5compat \
qt.qt6.682.debug_info extensions.qtpdf.682 qt.qt6.682.addons.qthttpserver
- run: - run:
name: Build linuxdeployqt name: Build linuxdeployqt
command: | command: |
@@ -427,8 +460,10 @@ jobs:
~/Qt/Tools/CMake/bin/cmake \ ~/Qt/Tools/CMake/bin/cmake \
-S ../gpt4all-chat -B . \ -S ../gpt4all-chat -B . \
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_C_COMPILER=gcc-12 \ -DCMAKE_C_COMPILER=clang-19 \
-DCMAKE_CXX_COMPILER=g++-12 \ -DCMAKE_CXX_COMPILER=clang++-19 \
-DCMAKE_CXX_COMPILER_AR=ar \
-DCMAKE_CXX_COMPILER_RANLIB=ranlib \
-DCMAKE_C_COMPILER_LAUNCHER=ccache \ -DCMAKE_C_COMPILER_LAUNCHER=ccache \
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \ -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
-DCMAKE_CUDA_COMPILER_LAUNCHER=ccache \ -DCMAKE_CUDA_COMPILER_LAUNCHER=ccache \
@@ -468,34 +503,7 @@ jobs:
keys: keys:
- ccache-gpt4all-linux-amd64- - ccache-gpt4all-linux-amd64-
- run: - run:
name: Setup Linux and Dependencies <<: *job-linux-install-chat-deps
command: |
# Prevent apt-get from interactively prompting for service restart
echo "\$nrconf{restart} = 'l'" | sudo tee /etc/needrestart/conf.d/90-autorestart.conf >/dev/null
wget -qO- "https://packages.lunarg.com/lunarg-signing-key-pub.asc" | sudo tee /etc/apt/trusted.gpg.d/lunarg.asc
wget -qO- "https://packages.lunarg.com/vulkan/1.3.290/lunarg-vulkan-1.3.290-jammy.list" | sudo tee /etc/apt/sources.list.d/lunarg-vulkan-1.3.290-jammy.list
wget "https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb"
sudo dpkg -i cuda-keyring_1.1-1_all.deb
packages=(
bison build-essential ccache cuda-compiler-11-8 flex g++-12 gperf libcublas-dev-11-8 libfontconfig1
libfreetype6 libgl1-mesa-dev libmysqlclient21 libnvidia-compute-550-server libodbc2 libpq5 libwayland-dev
libx11-6 libx11-xcb1 libxcb-cursor0 libxcb-glx0 libxcb-icccm4 libxcb-image0 libxcb-keysyms1 libxcb-randr0
libxcb-render-util0 libxcb-shape0 libxcb-shm0 libxcb-sync1 libxcb-util1 libxcb-xfixes0 libxcb-xinerama0
libxcb-xkb1 libxcb1 libxext6 libxfixes3 libxi6 libxkbcommon-dev libxkbcommon-x11-0 libxrender1 patchelf
python3 vulkan-sdk
)
sudo apt-get update
sudo apt-get install -y "${packages[@]}"
- run:
name: Installing Qt
command: |
wget "https://qt.mirror.constant.com/archive/online_installers/4.8/qt-online-installer-linux-x64-4.8.1.run"
chmod +x qt-online-installer-linux-x64-4.8.1.run
./qt-online-installer-linux-x64-4.8.1.run --no-force-installations --no-default-installations \
--no-size-checking --default-answer --accept-licenses --confirm-command --accept-obligations \
--email "$QT_EMAIL" --password "$QT_PASSWORD" install \
qt.tools.cmake qt.tools.ifw.48 qt.tools.ninja qt.qt6.682.linux_gcc_64 qt.qt6.682.addons.qt5compat \
qt.qt6.682.debug_info extensions.qtpdf.682 qt.qt6.682.addons.qthttpserver
- run: - run:
name: Build linuxdeployqt name: Build linuxdeployqt
command: | command: |
@@ -516,8 +524,10 @@ jobs:
~/Qt/Tools/CMake/bin/cmake \ ~/Qt/Tools/CMake/bin/cmake \
-S ../gpt4all-chat -B . \ -S ../gpt4all-chat -B . \
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_C_COMPILER=gcc-12 \ -DCMAKE_C_COMPILER=clang-19 \
-DCMAKE_CXX_COMPILER=g++-12 \ -DCMAKE_CXX_COMPILER=clang++-19 \
-DCMAKE_CXX_COMPILER_AR=ar \
-DCMAKE_CXX_COMPILER_RANLIB=ranlib \
-DCMAKE_C_COMPILER_LAUNCHER=ccache \ -DCMAKE_C_COMPILER_LAUNCHER=ccache \
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \ -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
-DCMAKE_CUDA_COMPILER_LAUNCHER=ccache \ -DCMAKE_CUDA_COMPILER_LAUNCHER=ccache \
@@ -1109,34 +1119,7 @@ jobs:
keys: keys:
- ccache-gpt4all-linux-amd64- - ccache-gpt4all-linux-amd64-
- run: - run:
name: Setup Linux and Dependencies <<: *job-linux-install-chat-deps
command: |
# Prevent apt-get from interactively prompting for service restart
echo "\$nrconf{restart} = 'l'" | sudo tee /etc/needrestart/conf.d/90-autorestart.conf >/dev/null
wget -qO- "https://packages.lunarg.com/lunarg-signing-key-pub.asc" | sudo tee /etc/apt/trusted.gpg.d/lunarg.asc
wget -qO- "https://packages.lunarg.com/vulkan/1.3.290/lunarg-vulkan-1.3.290-jammy.list" | sudo tee /etc/apt/sources.list.d/lunarg-vulkan-1.3.290-jammy.list
wget "https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb"
sudo dpkg -i cuda-keyring_1.1-1_all.deb
packages=(
bison build-essential ccache cuda-compiler-11-8 flex g++-12 gperf libcublas-dev-11-8 libfontconfig1
libfreetype6 libgl1-mesa-dev libmysqlclient21 libnvidia-compute-550-server libodbc2 libpq5 libwayland-dev
libx11-6 libx11-xcb1 libxcb-cursor0 libxcb-glx0 libxcb-icccm4 libxcb-image0 libxcb-keysyms1 libxcb-randr0
libxcb-render-util0 libxcb-shape0 libxcb-shm0 libxcb-sync1 libxcb-util1 libxcb-xfixes0 libxcb-xinerama0
libxcb-xkb1 libxcb1 libxext6 libxfixes3 libxi6 libxkbcommon-dev libxkbcommon-x11-0 libxrender1 python3
vulkan-sdk
)
sudo apt-get update
sudo apt-get install -y "${packages[@]}"
- run:
name: Installing Qt
command: |
wget "https://qt.mirror.constant.com/archive/online_installers/4.8/qt-online-installer-linux-x64-4.8.1.run"
chmod +x qt-online-installer-linux-x64-4.8.1.run
./qt-online-installer-linux-x64-4.8.1.run --no-force-installations --no-default-installations \
--no-size-checking --default-answer --accept-licenses --confirm-command --accept-obligations \
--email "$QT_EMAIL" --password "$QT_PASSWORD" install \
qt.tools.cmake qt.tools.ifw.48 qt.tools.ninja qt.qt6.682.linux_gcc_64 qt.qt6.682.addons.qt5compat \
qt.qt6.682.debug_info extensions.qtpdf.682 qt.qt6.682.addons.qthttpserver
- run: - run:
name: Build name: Build
no_output_timeout: 30m no_output_timeout: 30m
@@ -1147,8 +1130,10 @@ jobs:
~/Qt/Tools/CMake/bin/cmake \ ~/Qt/Tools/CMake/bin/cmake \
-S gpt4all-chat -B build \ -S gpt4all-chat -B build \
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_C_COMPILER=gcc-12 \ -DCMAKE_C_COMPILER=clang-19 \
-DCMAKE_CXX_COMPILER=g++-12 \ -DCMAKE_CXX_COMPILER=clang++-19 \
-DCMAKE_CXX_COMPILER_AR=ar \
-DCMAKE_CXX_COMPILER_RANLIB=ranlib \
-DCMAKE_C_COMPILER_LAUNCHER=ccache \ -DCMAKE_C_COMPILER_LAUNCHER=ccache \
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \ -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
-DCMAKE_CUDA_COMPILER_LAUNCHER=ccache \ -DCMAKE_CUDA_COMPILER_LAUNCHER=ccache \
@@ -1227,8 +1212,7 @@ jobs:
- ..\.ccache - ..\.ccache
build-gpt4all-chat-macos: build-gpt4all-chat-macos:
macos: <<: *job-macos-executor
xcode: 15.4.0
steps: steps:
- checkout - checkout
- run: - run:
@@ -1239,12 +1223,11 @@ jobs:
- restore_cache: - restore_cache:
keys: keys:
- ccache-gpt4all-macos- - ccache-gpt4all-macos-
- run:
<<: *job-macos-install-deps
- run: - run:
name: Install Rosetta name: Install Rosetta
command: softwareupdate --install-rosetta --agree-to-license # needed for QtIFW command: softwareupdate --install-rosetta --agree-to-license # needed for QtIFW
- run:
name: Install dependencies
command: brew install ccache wget
- run: - run:
name: Installing Qt name: Installing Qt
command: | command: |
@@ -1267,6 +1250,9 @@ jobs:
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_PREFIX_PATH:PATH=~/Qt/6.8.2/macos/lib/cmake \ -DCMAKE_PREFIX_PATH:PATH=~/Qt/6.8.2/macos/lib/cmake \
-DCMAKE_MAKE_PROGRAM:FILEPATH=~/Qt/Tools/Ninja/ninja \ -DCMAKE_MAKE_PROGRAM:FILEPATH=~/Qt/Tools/Ninja/ninja \
-DCMAKE_C_COMPILER=/opt/homebrew/opt/llvm/bin/clang \
-DCMAKE_CXX_COMPILER=/opt/homebrew/opt/llvm/bin/clang++ \
-DCMAKE_RANLIB=/usr/bin/ranlib \
-DCMAKE_C_COMPILER_LAUNCHER=ccache \ -DCMAKE_C_COMPILER_LAUNCHER=ccache \
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \ -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
-DBUILD_UNIVERSAL=ON \ -DBUILD_UNIVERSAL=ON \
@@ -1334,22 +1320,7 @@ jobs:
keys: keys:
- ccache-gpt4all-linux-amd64- - ccache-gpt4all-linux-amd64-
- run: - run:
name: Set Python Version <<: *job-linux-install-backend-deps
command: pyenv global 3.11.2
- run:
name: Install dependencies
command: |
wget -qO- "https://packages.lunarg.com/lunarg-signing-key-pub.asc" | sudo tee /etc/apt/trusted.gpg.d/lunarg.asc
wget -qO- "https://packages.lunarg.com/vulkan/1.3.290/lunarg-vulkan-1.3.290-jammy.list" | sudo tee /etc/apt/sources.list.d/lunarg-vulkan-1.3.290-jammy.list
wget "https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb"
sudo dpkg -i cuda-keyring_1.1-1_all.deb
packages=(
build-essential ccache cmake cuda-compiler-11-8 g++-12 libcublas-dev-11-8 libnvidia-compute-550-server
vulkan-sdk
)
sudo apt-get update
sudo apt-get install -y "${packages[@]}"
pip install setuptools wheel cmake
- run: - run:
name: Build C library name: Build C library
no_output_timeout: 30m no_output_timeout: 30m
@@ -1358,15 +1329,17 @@ jobs:
git submodule update --init --recursive git submodule update --init --recursive
ccache -o "cache_dir=${PWD}/../.ccache" -o max_size=500M -p -z ccache -o "cache_dir=${PWD}/../.ccache" -o max_size=500M -p -z
cd gpt4all-backend cd gpt4all-backend
cmake -B build \ cmake -B build -G Ninja \
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_C_COMPILER=gcc-12 \ -DCMAKE_C_COMPILER=clang-19 \
-DCMAKE_CXX_COMPILER=g++-12 \ -DCMAKE_CXX_COMPILER=clang++-19 \
-DCMAKE_CXX_COMPILER_AR=ar \
-DCMAKE_CXX_COMPILER_RANLIB=ranlib \
-DCMAKE_C_COMPILER_LAUNCHER=ccache \ -DCMAKE_C_COMPILER_LAUNCHER=ccache \
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \ -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
-DCMAKE_CUDA_COMPILER_LAUNCHER=ccache \ -DCMAKE_CUDA_COMPILER_LAUNCHER=ccache \
-DKOMPUTE_OPT_DISABLE_VULKAN_VERSION_CHECK=ON \ -DKOMPUTE_OPT_DISABLE_VULKAN_VERSION_CHECK=ON \
-DCMAKE_CUDA_ARCHITECTURES='52-virtual;61-virtual;70-virtual;75-virtual' -DCMAKE_CUDA_ARCHITECTURES='50-virtual;52-virtual;61-virtual;70-virtual;75-virtual'
cmake --build build -j$(nproc) cmake --build build -j$(nproc)
ccache -s ccache -s
- run: - run:
@@ -1387,18 +1360,17 @@ jobs:
- "*.whl" - "*.whl"
build-py-macos: build-py-macos:
macos: <<: *job-macos-executor
xcode: 15.4.0
resource_class: macos.m1.large.gen1
steps: steps:
- checkout - checkout
- restore_cache: - restore_cache:
keys: keys:
- ccache-gpt4all-macos- - ccache-gpt4all-macos-
- run:
<<: *job-macos-install-deps
- run: - run:
name: Install dependencies name: Install dependencies
command: | command: |
brew install ccache cmake
pip install setuptools wheel cmake pip install setuptools wheel cmake
- run: - run:
name: Build C library name: Build C library
@@ -1409,6 +1381,9 @@ jobs:
cd gpt4all-backend cd gpt4all-backend
cmake -B build \ cmake -B build \
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_C_COMPILER=/opt/homebrew/opt/llvm/bin/clang \
-DCMAKE_CXX_COMPILER=/opt/homebrew/opt/llvm/bin/clang++ \
-DCMAKE_RANLIB=/usr/bin/ranlib \
-DCMAKE_C_COMPILER_LAUNCHER=ccache \ -DCMAKE_C_COMPILER_LAUNCHER=ccache \
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \ -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
-DBUILD_UNIVERSAL=ON \ -DBUILD_UNIVERSAL=ON \
@@ -1483,7 +1458,7 @@ jobs:
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache ` -DCMAKE_CXX_COMPILER_LAUNCHER=ccache `
-DCMAKE_CUDA_COMPILER_LAUNCHER=ccache ` -DCMAKE_CUDA_COMPILER_LAUNCHER=ccache `
-DKOMPUTE_OPT_DISABLE_VULKAN_VERSION_CHECK=ON ` -DKOMPUTE_OPT_DISABLE_VULKAN_VERSION_CHECK=ON `
-DCMAKE_CUDA_ARCHITECTURES='52-virtual;61-virtual;70-virtual;75-virtual' -DCMAKE_CUDA_ARCHITECTURES='50-virtual;52-virtual;61-virtual;70-virtual;75-virtual'
cmake --build build --parallel cmake --build build --parallel
ccache -s ccache -s
- run: - run:
@@ -1537,18 +1512,7 @@ jobs:
keys: keys:
- ccache-gpt4all-linux-amd64- - ccache-gpt4all-linux-amd64-
- run: - run:
name: Install dependencies <<: *job-linux-install-backend-deps
command: |
wget -qO- "https://packages.lunarg.com/lunarg-signing-key-pub.asc" | sudo tee /etc/apt/trusted.gpg.d/lunarg.asc
wget -qO- "https://packages.lunarg.com/vulkan/1.3.290/lunarg-vulkan-1.3.290-jammy.list" | sudo tee /etc/apt/sources.list.d/lunarg-vulkan-1.3.290-jammy.list
wget "https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb"
sudo dpkg -i cuda-keyring_1.1-1_all.deb
packages=(
build-essential ccache cmake cuda-compiler-11-8 g++-12 libcublas-dev-11-8 libnvidia-compute-550-server
vulkan-sdk
)
sudo apt-get update
sudo apt-get install -y "${packages[@]}"
- run: - run:
name: Build Libraries name: Build Libraries
no_output_timeout: 30m no_output_timeout: 30m
@@ -1558,10 +1522,12 @@ jobs:
cd gpt4all-backend cd gpt4all-backend
mkdir -p runtimes/build mkdir -p runtimes/build
cd runtimes/build cd runtimes/build
cmake ../.. \ cmake ../.. -G Ninja \
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_C_COMPILER=gcc-12 \ -DCMAKE_C_COMPILER=clang-19 \
-DCMAKE_C_COMPILER=g++-12 \ -DCMAKE_CXX_COMPILER=clang++-19 \
-DCMAKE_CXX_COMPILER_AR=ar \
-DCMAKE_CXX_COMPILER_RANLIB=ranlib \
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_C_COMPILER_LAUNCHER=ccache \ -DCMAKE_C_COMPILER_LAUNCHER=ccache \
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \ -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
@@ -1582,8 +1548,7 @@ jobs:
- runtimes/linux-x64/*.so - runtimes/linux-x64/*.so
build-bindings-backend-macos: build-bindings-backend-macos:
macos: <<: *job-macos-executor
xcode: 15.4.0
steps: steps:
- checkout - checkout
- run: - run:
@@ -1595,9 +1560,7 @@ jobs:
keys: keys:
- ccache-gpt4all-macos- - ccache-gpt4all-macos-
- run: - run:
name: Install dependencies <<: *job-macos-install-deps
command: |
brew install ccache cmake
- run: - run:
name: Build Libraries name: Build Libraries
no_output_timeout: 30m no_output_timeout: 30m
@@ -1608,6 +1571,9 @@ jobs:
cd runtimes/build cd runtimes/build
cmake ../.. \ cmake ../.. \
-DCMAKE_BUILD_TYPE=Release \ -DCMAKE_BUILD_TYPE=Release \
-DCMAKE_C_COMPILER=/opt/homebrew/opt/llvm/bin/clang \
-DCMAKE_CXX_COMPILER=/opt/homebrew/opt/llvm/bin/clang++ \
-DCMAKE_RANLIB=/usr/bin/ranlib \
-DCMAKE_C_COMPILER_LAUNCHER=ccache \ -DCMAKE_C_COMPILER_LAUNCHER=ccache \
-DCMAKE_CXX_COMPILER_LAUNCHER=ccache \ -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
-DBUILD_UNIVERSAL=ON \ -DBUILD_UNIVERSAL=ON \
@@ -1725,8 +1691,7 @@ jobs:
- runtimes/linux-x64/*-*.so - runtimes/linux-x64/*-*.so
build-nodejs-macos: build-nodejs-macos:
macos: <<: *job-macos-executor
xcode: 15.4.0
steps: steps:
- checkout - checkout
- attach_workspace: - attach_workspace:
@@ -1903,22 +1868,22 @@ workflows:
context: gpt4all context: gpt4all
- build-offline-chat-installer-linux: - build-offline-chat-installer-linux:
context: gpt4all context: gpt4all
- sign-online-chat-installer-macos: - sign-offline-chat-installer-macos:
context: gpt4all context: gpt4all
requires: requires:
- build-online-chat-installer-macos - build-offline-chat-installer-macos
- notarize-online-chat-installer-macos: - notarize-offline-chat-installer-macos:
context: gpt4all context: gpt4all
requires: requires:
- sign-online-chat-installer-macos - sign-offline-chat-installer-macos
- sign-online-chat-installer-windows: - sign-offline-chat-installer-windows:
context: gpt4all context: gpt4all
requires: requires:
- build-online-chat-installer-windows - build-offline-chat-installer-windows
- sign-online-chat-installer-windows-arm: - sign-offline-chat-installer-windows-arm:
context: gpt4all context: gpt4all
requires: requires:
- build-online-chat-installer-windows-arm - build-offline-chat-installer-windows-arm
build-chat-installers-release: build-chat-installers-release:
# only run on main branch tags that start with 'v' and a digit # only run on main branch tags that start with 'v' and a digit
when: when:

View File

@@ -72,6 +72,6 @@ Discord: `@Tim453`
- Flatpak - Flatpak
Jack ([@wuodoo](https://github.com/wuodoo))<br/> Jack ([@wuodoo](https://github.com/wuodoo))<br/>
E-mail: 2296103047@qq.com><br/> E-mail: 2296103047@qq.com<br/>
Discord: `@mikage` Discord: `@mikage`
- zh\_CN translation - zh\_CN translation

View File

@@ -35,6 +35,11 @@ GPT4All is made possible by our compute partner <a href="https://www.paperspace.
<img src="gpt4all-bindings/python/docs/assets/windows.png" style="height: 1em; width: auto" /> Windows Installer <img src="gpt4all-bindings/python/docs/assets/windows.png" style="height: 1em; width: auto" /> Windows Installer
</a> &mdash; </a> &mdash;
</p> </p>
<p>
&mdash; <a href="https://gpt4all.io/installers/gpt4all-installer-win64-arm.exe">
<img src="gpt4all-bindings/python/docs/assets/windows.png" style="height: 1em; width: auto" /> Windows ARM Installer
</a> &mdash;
</p>
<p> <p>
&mdash; <a href="https://gpt4all.io/installers/gpt4all-installer-darwin.dmg"> &mdash; <a href="https://gpt4all.io/installers/gpt4all-installer-darwin.dmg">
<img src="gpt4all-bindings/python/docs/assets/mac.png" style="height: 1em; width: auto" /> macOS Installer <img src="gpt4all-bindings/python/docs/assets/mac.png" style="height: 1em; width: auto" /> macOS Installer
@@ -46,10 +51,16 @@ GPT4All is made possible by our compute partner <a href="https://www.paperspace.
</a> &mdash; </a> &mdash;
</p> </p>
<p> <p>
Windows and Linux require Intel Core i3 2nd Gen / AMD Bulldozer, or better. x86-64 only, no ARM. The Windows and Linux builds require Intel Core i3 2nd Gen / AMD Bulldozer, or better.
</p> </p>
<p> <p>
macOS requires Monterey 12.6 or newer. Best results with Apple Silicon M-series processors. The Windows ARM build supports Qualcomm Snapdragon and Microsoft SQ1/SQ2 processors.
</p>
<p>
The Linux build is x86-64 only (no ARM).
</p>
<p>
The macOS build requires Monterey 12.6 or newer. Best results with Apple Silicon M-series processors.
</p> </p>
See the full [System Requirements](gpt4all-chat/system_requirements.md) for more details. See the full [System Requirements](gpt4all-chat/system_requirements.md) for more details.

View File

@@ -69,7 +69,7 @@ if (LLMODEL_CUDA)
cmake_minimum_required(VERSION 3.18) # for CMAKE_CUDA_ARCHITECTURES cmake_minimum_required(VERSION 3.18) # for CMAKE_CUDA_ARCHITECTURES
# Defaults must be set before enable_language(CUDA). # Defaults must be set before enable_language(CUDA).
# Keep this in sync with the arch list in ggml/src/CMakeLists.txt. # Keep this in sync with the arch list in ggml/src/CMakeLists.txt (plus 5.0 for non-F16 branch).
if (NOT DEFINED CMAKE_CUDA_ARCHITECTURES) if (NOT DEFINED CMAKE_CUDA_ARCHITECTURES)
# 52 == lowest CUDA 12 standard # 52 == lowest CUDA 12 standard
# 60 == f16 CUDA intrinsics # 60 == f16 CUDA intrinsics
@ -78,7 +78,7 @@ if (LLMODEL_CUDA)
if (GGML_CUDA_F16 OR GGML_CUDA_DMMV_F16) if (GGML_CUDA_F16 OR GGML_CUDA_DMMV_F16)
set(CMAKE_CUDA_ARCHITECTURES "60;61;70;75") # needed for f16 CUDA intrinsics set(CMAKE_CUDA_ARCHITECTURES "60;61;70;75") # needed for f16 CUDA intrinsics
else() else()
set(CMAKE_CUDA_ARCHITECTURES "52;61;70;75") # lowest CUDA 12 standard + lowest for integer intrinsics set(CMAKE_CUDA_ARCHITECTURES "50;52;61;70;75") # lowest CUDA 12 standard + lowest for integer intrinsics
#set(CMAKE_CUDA_ARCHITECTURES "OFF") # use this to compile much faster, but only F16 models work #set(CMAKE_CUDA_ARCHITECTURES "OFF") # use this to compile much faster, but only F16 models work
endif() endif()
endif() endif()
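For anyone building the backend locally, the default arch list above can be overridden at configure time without editing this file; a minimal sketch, assuming a CUDA toolkit is installed (the build directory name and the single compute-8.6 architecture are illustrative):

```shell
# Illustrative: configure the CUDA backend for a single GPU architecture.
# Compiling only your card's arch (e.g. 86 for an RTX 30-series GPU) is
# much faster than building the full default list.
cmake -B build -DLLMODEL_CUDA=ON -DCMAKE_CUDA_ARCHITECTURES=86
cmake --build build -j
```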

@@ -1 +1 @@
-Subproject commit 3ebb3603e807d74a16f061c46d2925a1653e7a93
+Subproject commit 11f734c3b0334dbae4823b4a7467764e447fc6d6


@@ -53,6 +53,7 @@ static const std::vector<const char *> KNOWN_ARCHES {
    "gpt2",
    // "gptj", -- no inference code
    "gptneox",
+   "granite",
    "granitemoe",
    "mpt",
    "baichuan",
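For context, `KNOWN_ARCHES` acts as a simple whitelist that the loader consults before accepting a model. A self-contained sketch of that check (the `isArchSupported` helper and the trimmed-down list are illustrative, not from the codebase):

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Illustrative whitelist check: a model architecture is accepted only if
// its name appears in the known-arches list (a trimmed-down copy here).
static const std::vector<const char *> KNOWN_ARCHES {
    "llama", "gpt2", "gptneox", "granite", "granitemoe", "mpt", "baichuan",
};

bool isArchSupported(const std::string &arch)
{
    // std::string == const char * compares character-wise, so std::find works directly.
    return std::find(KNOWN_ARCHES.begin(), KNOWN_ARCHES.end(), arch) != KNOWN_ARCHES.end();
}
```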


@@ -140,9 +140,14 @@ const std::vector<LLModel::Implementation> &LLModel::Implementation::implementat
        std::string path;
        // Split the paths string by the delimiter and process each path.
        while (std::getline(ss, path, ';')) {
-           std::u8string u8_path(path.begin(), path.end());
+           fs::directory_iterator iter;
+           try {
+               iter = fs::directory_iterator(std::u8string(path.begin(), path.end()));
+           } catch (const fs::filesystem_error &) {
+               continue; // skip nonexistent path
+           }
            // Iterate over all libraries
-           for (const auto &f : fs::directory_iterator(u8_path)) {
+           for (const auto &f : iter) {
                const fs::path &p = f.path();
                if (p.extension() != LIB_FILE_EXT) continue;
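The hardened loop constructs the `directory_iterator` eagerly inside a `try` block so a nonexistent search path is skipped rather than throwing out of the whole scan; a default-constructed `fs::directory_iterator` is the end iterator, so the later range-for is always safe. A standalone sketch of the same pattern, with an illustrative `scanPaths` helper:

```cpp
#include <filesystem>
#include <sstream>
#include <string>
#include <vector>

namespace fs = std::filesystem;

// Illustrative helper: collect regular files from a ';'-separated list of
// directories, silently skipping entries that do not exist or cannot be read.
std::vector<fs::path> scanPaths(const std::string &searchPaths)
{
    std::vector<fs::path> found;
    std::istringstream ss(searchPaths);
    std::string path;
    while (std::getline(ss, path, ';')) {
        fs::directory_iterator iter;
        try {
            iter = fs::directory_iterator(path);
        } catch (const fs::filesystem_error &) {
            continue; // skip nonexistent or unreadable path instead of aborting
        }
        for (const auto &entry : iter)
            if (entry.is_regular_file())
                found.push_back(entry.path());
    }
    return found;
}
```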


@@ -4,13 +4,30 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
+## [3.10.0] - 2025-02-24
+### Added
+- Whitelist Granite (non-MoE) model architecture (by [@ThiloteE](https://github.com/ThiloteE) in [#3487](https://github.com/nomic-ai/gpt4all/pull/3487))
+- Add support for CUDA compute 5.0 GPUs such as the GTX 750 ([#3499](https://github.com/nomic-ai/gpt4all/pull/3499))
+- Add a Remote Providers tab to the Add Model page ([#3506](https://github.com/nomic-ai/gpt4all/pull/3506))
+### Changed
+- Substitute prettier default templates for OLMoE 7B 0924/0125 and Granite 3.1 3B/8B (by [@ThiloteE](https://github.com/ThiloteE) in [#3471](https://github.com/nomic-ai/gpt4all/pull/3471))
+- Build with LLVM Clang 19 on macOS and Ubuntu ([#3500](https://github.com/nomic-ai/gpt4all/pull/3500))
+### Fixed
+- Fix several potential crashes ([#3465](https://github.com/nomic-ai/gpt4all/pull/3465))
+- Fix visual spacing issues with DeepSeek models ([#3470](https://github.com/nomic-ai/gpt4all/pull/3470))
+- Add missing strings to Italian translation (by [@Harvester62](https://github.com/Harvester62) in [#3496](https://github.com/nomic-ai/gpt4all/pull/3496))
+- Update Simplified Chinese translation (by [@Junior2Ran](https://github.com/Junior2Ran) in [#3467](https://github.com/nomic-ai/gpt4all/pull/3467))
## [3.9.0] - 2025-02-04
### Added
- Whitelist OLMoE and Granite MoE model architectures (no Vulkan) (by [@ThiloteE](https://github.com/ThiloteE) in [#3449](https://github.com/nomic-ai/gpt4all/pull/3449))
### Fixed
-- Fix "index N is not a prompt" when using LocalDocs with reasoning ([#3451](https://github.com/nomic-ai/gpt4all/pull/3451)
+- Fix "index N is not a prompt" when using LocalDocs with reasoning ([#3451](https://github.com/nomic-ai/gpt4all/pull/3451))
- Work around rendering artifacts on Snapdragon SoCs with Windows ([#3450](https://github.com/nomic-ai/gpt4all/pull/3450))
- Prevent DeepSeek-R1 reasoning from appearing in chat names and follow-up questions ([#3458](https://github.com/nomic-ai/gpt4all/pull/3458))
- Fix LocalDocs crash on Windows ARM when reading PDFs ([#3460](https://github.com/nomic-ai/gpt4all/pull/3460))
@@ -295,6 +312,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
- Fix several Vulkan resource management issues ([#2694](https://github.com/nomic-ai/gpt4all/pull/2694))
- Fix crash/hang when some models stop generating, by showing special tokens ([#2701](https://github.com/nomic-ai/gpt4all/pull/2701))
+[3.10.0]: https://github.com/nomic-ai/gpt4all/compare/v3.9.0...v3.10.0
[3.9.0]: https://github.com/nomic-ai/gpt4all/compare/v3.8.0...v3.9.0
[3.8.0]: https://github.com/nomic-ai/gpt4all/compare/v3.7.0...v3.8.0
[3.7.0]: https://github.com/nomic-ai/gpt4all/compare/v3.6.1...v3.7.0


@@ -3,10 +3,10 @@ cmake_minimum_required(VERSION 3.25) # for try_compile SOURCE_FROM_VAR
include(../common/common.cmake)
set(APP_VERSION_MAJOR 3)
-set(APP_VERSION_MINOR 9)
-set(APP_VERSION_PATCH 0)
+set(APP_VERSION_MINOR 10)
+set(APP_VERSION_PATCH 1)
set(APP_VERSION_BASE "${APP_VERSION_MAJOR}.${APP_VERSION_MINOR}.${APP_VERSION_PATCH}")
-set(APP_VERSION "${APP_VERSION_BASE}")
+set(APP_VERSION "${APP_VERSION_BASE}-dev0")
project(gpt4all VERSION ${APP_VERSION_BASE} LANGUAGES CXX C)
@@ -104,7 +104,7 @@ elseif (GPT4ALL_USE_QTPDF MATCHES "^(ON|AUTO)$")
    set(GPT4ALL_USING_QTPDF ON)
    list(APPEND GPT4ALL_QT_COMPONENTS Pdf)
endif()
-find_package(Qt6 6.5 COMPONENTS ${GPT4ALL_QT_COMPONENTS} REQUIRED)
+find_package(Qt6 6.8 COMPONENTS ${GPT4ALL_QT_COMPONENTS} REQUIRED)
if (QT_KNOWN_POLICY_QTP0004)
    qt_policy(SET QTP0004 NEW) # generate extra qmldir files on Qt 6.8+
@@ -266,6 +266,7 @@ qt_add_qml_module(chat
    qml/AddModelView.qml
    qml/AddGPT4AllModelView.qml
    qml/AddHFModelView.qml
+   qml/AddRemoteModelView.qml
    qml/ApplicationSettings.qml
    qml/ChatDrawer.qml
    qml/ChatCollapsibleItem.qml
@@ -314,6 +315,7 @@ qt_add_qml_module(chat
    qml/MyTextField.qml
    qml/MyToolButton.qml
    qml/MyWelcomeButton.qml
+   qml/RemoteModelCard.qml
    RESOURCES
    icons/antenna_1.svg
    icons/antenna_2.svg
@@ -344,6 +346,7 @@ qt_add_qml_module(chat
    icons/gpt4all-48.png
    icons/gpt4all.svg
    icons/gpt4all_transparent.svg
+   icons/groq.svg
    icons/home.svg
    icons/image.svg
    icons/info.svg
@@ -351,12 +354,14 @@ qt_add_qml_module(chat
    icons/left_panel_open.svg
    icons/local-docs.svg
    icons/models.svg
+   icons/mistral.svg
    icons/network.svg
    icons/nomic_logo.svg
    icons/notes.svg
    icons/paperclip.svg
    icons/plus.svg
    icons/plus_circle.svg
+   icons/openai.svg
    icons/recycle.svg
    icons/regenerate.svg
    icons/search.svg


@@ -0,0 +1,3 @@
<?xml version="1.0" encoding="utf-8" ?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 26.3 26.3"><defs><style>.cls-1{fill:#f05237;}.cls-2{fill:#fff;}</style></defs><g id="Layer_2" data-name="Layer 2"><g id="Content"><circle class="cls-1" cx="13.15" cy="13.15" r="13.15"/><path class="cls-2" d="M13.17,6.88a4.43,4.43,0,0,0,0,8.85h1.45V14.07H13.17a2.77,2.77,0,1,1,2.77-2.76v4.07a2.74,2.74,0,0,1-4.67,2L10.1,18.51a4.37,4.37,0,0,0,3.07,1.29h.06a4.42,4.42,0,0,0,4.36-4.4V11.2a4.43,4.43,0,0,0-4.42-4.32"/></g></g></svg>


@@ -0,0 +1 @@
<svg viewBox="0 0 512 512" xmlns="http://www.w3.org/2000/svg" fill-rule="evenodd" clip-rule="evenodd" stroke-linejoin="round" stroke-miterlimit="2"><path d="M189.08 303.228H94.587l.044-94.446h94.497l-.048 94.446z" fill="#1c1c1b" fill-rule="nonzero"/><path d="M283.528 397.674h-94.493l.044-94.446h94.496l-.047 94.446z" fill="#1c1c1b" fill-rule="nonzero"/><path d="M283.575 303.228H189.08l.046-94.446h94.496l-.047 94.446z" fill="#1c1c1b" fill-rule="nonzero"/><path d="M378.07 303.228h-94.495l.044-94.446h94.498l-.047 94.446zM189.128 208.779H94.633l.044-94.448h94.498l-.047 94.448zM378.115 208.779h-94.494l.045-94.448h94.496l-.047 94.448zM94.587 303.227H.093l.044-96.017h94.496l-.046 96.017z" fill="#1c1c1b" fill-rule="nonzero"/><path d="M94.633 208.779H.138l.046-94.448H94.68l-.047 94.448z" fill="#1c1c1b" fill-rule="nonzero"/><path d="M94.68 115.902H.185L.23 19.885h94.498l-.047 96.017zM472.657 114.331h-94.495l.044-94.446h94.497l-.046 94.446zM94.54 399.244H.046l.044-97.588h94.497l-.047 97.588z" fill="#1c1c1b" fill-rule="nonzero"/><path d="M94.495 492.123H0l.044-94.446H94.54l-.045 94.446zM472.563 303.228H378.07l.044-94.446h94.496l-.047 94.446zM472.61 208.779h-94.495l.044-94.448h94.498l-.047 94.448z" fill="#1c1c1b" fill-rule="nonzero"/><path d="M472.517 397.674h-94.494l.044-94.446h94.497l-.047 94.446z" fill="#1c1c1b" fill-rule="nonzero"/><path d="M472.47 492.121h-94.493l.044-96.017h94.496l-.047 96.017z" fill="#1c1c1b" fill-rule="nonzero"/><path d="M228.375 303.22h-96.061l.046-94.446h96.067l-.052 94.446z" fill="#ff7000" fill-rule="nonzero"/><path d="M322.827 397.666h-94.495l.044-96.018h94.498l-.047 96.018z" fill="#ff4900" fill-rule="nonzero"/><path d="M324.444 303.22h-97.636l.046-94.446h97.638l-.048 94.446z" fill="#ff7000" fill-rule="nonzero"/><path d="M418.938 303.22h-96.064l.045-94.446h96.066l-.047 94.446z" fill="#ff7000" fill-rule="nonzero"/><path d="M228.423 208.77H132.36l.045-94.445h96.066l-.05 94.446zM418.985 208.77H322.92l.044-94.445h96.069l-.048 94.446z" fill="#ffa300" 
fill-rule="nonzero"/><path d="M133.883 304.79H39.392l.044-96.017h94.496l-.049 96.017z" fill="#ff7000" fill-rule="nonzero"/><path d="M133.929 208.77H39.437l.044-95.445h94.496l-.048 95.445z" fill="#ffa300" fill-rule="nonzero"/><path d="M133.976 114.325H39.484l.044-94.448h94.497l-.05 94.448zM511.954 115.325h-94.493l.044-95.448h94.497l-.048 95.448z" fill="#ffce00" fill-rule="nonzero"/><path d="M133.836 399.667H39.345l.044-96.447h94.496l-.049 96.447z" fill="#ff4900" fill-rule="nonzero"/><path d="M133.79 492.117H39.3l.044-94.448h94.496l-.049 94.448z" fill="#ff0107" fill-rule="nonzero"/><path d="M511.862 303.22h-94.495l.046-94.446h94.496l-.047 94.446z" fill="#ff7000" fill-rule="nonzero"/><path d="M511.907 208.77h-94.493l.044-94.445h94.496l-.047 94.446z" fill="#ffa300" fill-rule="nonzero"/><path d="M511.815 398.666h-94.493l.044-95.447h94.496l-.047 95.447z" fill="#ff4900" fill-rule="nonzero"/><path d="M511.77 492.117h-94.496l.046-94.448h94.496l-.047 94.448z" fill="#ff0107" fill-rule="nonzero"/></svg>


@@ -0,0 +1,2 @@
<?xml version="1.0" encoding="utf-8"?><!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg fill="#000000" width="800px" height="800px" viewBox="0 0 24 24" role="img" xmlns="http://www.w3.org/2000/svg"><title>OpenAI icon</title><path d="M22.2819 9.8211a5.9847 5.9847 0 0 0-.5157-4.9108 6.0462 6.0462 0 0 0-6.5098-2.9A6.0651 6.0651 0 0 0 4.9807 4.1818a5.9847 5.9847 0 0 0-3.9977 2.9 6.0462 6.0462 0 0 0 .7427 7.0966 5.98 5.98 0 0 0 .511 4.9107 6.051 6.051 0 0 0 6.5146 2.9001A5.9847 5.9847 0 0 0 13.2599 24a6.0557 6.0557 0 0 0 5.7718-4.2058 5.9894 5.9894 0 0 0 3.9977-2.9001 6.0557 6.0557 0 0 0-.7475-7.0729zm-9.022 12.6081a4.4755 4.4755 0 0 1-2.8764-1.0408l.1419-.0804 4.7783-2.7582a.7948.7948 0 0 0 .3927-.6813v-6.7369l2.02 1.1686a.071.071 0 0 1 .038.052v5.5826a4.504 4.504 0 0 1-4.4945 4.4944zm-9.6607-4.1254a4.4708 4.4708 0 0 1-.5346-3.0137l.142.0852 4.783 2.7582a.7712.7712 0 0 0 .7806 0l5.8428-3.3685v2.3324a.0804.0804 0 0 1-.0332.0615L9.74 19.9502a4.4992 4.4992 0 0 1-6.1408-1.6464zM2.3408 7.8956a4.485 4.485 0 0 1 2.3655-1.9728V11.6a.7664.7664 0 0 0 .3879.6765l5.8144 3.3543-2.0201 1.1685a.0757.0757 0 0 1-.071 0l-4.8303-2.7865A4.504 4.504 0 0 1 2.3408 7.872zm16.5963 3.8558L13.1038 8.364 15.1192 7.2a.0757.0757 0 0 1 .071 0l4.8303 2.7913a4.4944 4.4944 0 0 1-.6765 8.1042v-5.6772a.79.79 0 0 0-.407-.667zm2.0107-3.0231l-.142-.0852-4.7735-2.7818a.7759.7759 0 0 0-.7854 0L9.409 9.2297V6.8974a.0662.0662 0 0 1 .0284-.0615l4.8303-2.7866a4.4992 4.4992 0 0 1 6.6802 4.66zM8.3065 12.863l-2.02-1.1638a.0804.0804 0 0 1-.038-.0567V6.0742a4.4992 4.4992 0 0 1 7.3757-3.4537l-.142.0805L8.704 5.459a.7948.7948 0 0 0-.3927.6813zm1.0976-2.3654l2.602-1.4998 2.6069 1.4998v2.9994l-2.5974 1.4997-2.6067-1.4997Z"/></svg>


@@ -1,16 +1,15 @@
## Latest News
-GPT4All v3.8.0 was released on January 30th. Changes include:
-* **Native DeepSeek-R1-Distill Support:** GPT4All now has robust support for the DeepSeek-R1 family of distillations.
-  * Several model variants are now available on the downloads page.
-  * Reasoning (wrapped in "think" tags) is displayed similarly to the Reasoner model.
-  * The DeepSeek-R1 Qwen pretokenizer is now supported, resolving the loading failure in previous versions.
-  * The model is now configured with a GPT4All-compatible prompt template by default.
-* **Chat Templating Overhaul:** The template parser has been *completely* replaced with one that has much better compatibility with common models.
-* **Code Interpreter Fixes:**
-  * An issue preventing the code interpreter from logging a single string in v3.7.0 has been fixed.
-  * The UI no longer freezes while the code interpreter is running a computation.
-* **Local Server Fixes:**
-  * An issue preventing the server from using LocalDocs after the first request since v3.5.0 has been fixed.
-  * System messages are now correctly hidden from the message history.
+GPT4All v3.10.0 was released on February 24th. Changes include:
+* **Remote Models:**
+  * The Add Model page now has a dedicated tab for remote model providers.
+  * Groq, OpenAI, and Mistral remote models are now easier to configure.
+* **CUDA Compatibility:** GPUs with CUDA compute capability 5.0 such as the GTX 750 are now supported by the CUDA backend.
+* **New Model:** The non-MoE Granite model is now supported.
+* **Translation Updates:**
+  * The Italian translation has been updated.
+  * The Simplified Chinese translation has been significantly improved.
+* **Better Chat Templates:** The default chat templates for OLMoE 7B 0924/0125 and Granite 3.1 3B/8B have been improved.
+* **Whitespace Fixes:** DeepSeek-R1-based models now have better whitespace behavior in their output.
+* **Crash Fixes:** Several issues that could potentially cause GPT4All to crash have been fixed.


@@ -268,5 +268,15 @@
        "version": "3.8.0",
        "notes": "* **Native DeepSeek-R1-Distill Support:** GPT4All now has robust support for the DeepSeek-R1 family of distillations.\n * Several model variants are now available on the downloads page.\n * Reasoning (wrapped in \"think\" tags) is displayed similarly to the Reasoner model.\n * The DeepSeek-R1 Qwen pretokenizer is now supported, resolving the loading failure in previous versions.\n * The model is now configured with a GPT4All-compatible prompt template by default.\n* **Chat Templating Overhaul:** The template parser has been *completely* replaced with one that has much better compatibility with common models.\n* **Code Interpreter Fixes:**\n * An issue preventing the code interpreter from logging a single string in v3.7.0 has been fixed.\n * The UI no longer freezes while the code interpreter is running a computation.\n* **Local Server Fixes:**\n * An issue preventing the server from using LocalDocs after the first request since v3.5.0 has been fixed.\n * System messages are now correctly hidden from the message history.\n",
        "contributors": "* Jared Van Bortel (Nomic AI)\n* Adam Treat (Nomic AI)\n* ThiloteE (`@ThiloteE`)"
+    },
+    {
+        "version": "3.9.0",
+        "notes": "* **LocalDocs Fix:** LocalDocs no longer shows an error on later messages with reasoning models.\n* **DeepSeek Fix:** DeepSeek-R1 reasoning (in 'think' tags) no longer appears in chat names and follow-up questions.\n* **Windows ARM Improvements:**\n * Graphical artifacts on some SoCs have been fixed.\n * A crash when adding a collection of PDFs to LocalDocs has been fixed.\n* **Template Parser Fixes:** Chat templates containing an unclosed comment no longer freeze GPT4All.\n* **New Models:** OLMoE and Granite MoE models are now supported.\n",
+        "contributors": "* Jared Van Bortel (Nomic AI)\n* Adam Treat (Nomic AI)\n* ThiloteE (`@ThiloteE`)"
+    },
+    {
+        "version": "3.10.0",
+        "notes": "* **Remote Models:**\n * The Add Model page now has a dedicated tab for remote model providers.\n * Groq, OpenAI, and Mistral remote models are now easier to configure.\n* **CUDA Compatibility:** GPUs with CUDA compute capability 5.0 such as the GTX 750 are now supported by the CUDA backend.\n* **New Model:** The non-MoE Granite model is now supported.\n* **Translation Updates:**\n * The Italian translation has been updated.\n * The Simplified Chinese translation has been significantly improved.\n* **Better Chat Templates:** The default chat templates for OLMoE 7B 0924/0125 and Granite 3.1 3B/8B have been improved.\n* **Whitespace Fixes:** DeepSeek-R1-based models now have better whitespace behavior in their output.\n* **Crash Fixes:** Several issues that could potentially cause GPT4All to crash have been fixed.\n",
+        "contributors": "* Jared Van Bortel (Nomic AI)\n* Adam Treat (Nomic AI)\n* ThiloteE (`@ThiloteE`)\n* Lil Bob (`@Junior2Ran`)\n* Riccardo Giovanetti (`@Harvester62`)"
    }
]


@@ -204,7 +204,7 @@ ColumnLayout {
        Layout.minimumWidth: 200
        Layout.fillWidth: true
        Layout.alignment: Qt.AlignTop | Qt.AlignHCenter
-       visible: !isOnline && !installed && !calcHash && downloadError === ""
+       visible: !installed && !calcHash && downloadError === ""
        Accessible.description: qsTr("Stop/restart/start the download")
        onClicked: {
            if (!isDownloading) {
@@ -230,52 +230,6 @@ ColumnLayout {
        }
    }
-   MySettingsButton {
-       id: installButton
-       visible: !installed && isOnline
-       Layout.topMargin: 20
-       Layout.leftMargin: 20
-       Layout.minimumWidth: 200
-       Layout.fillWidth: true
-       Layout.alignment: Qt.AlignTop | Qt.AlignHCenter
-       text: qsTr("Install")
-       font.pixelSize: theme.fontSizeLarge
-       onClicked: {
-           var apiKeyText = apiKey.text.trim(),
-               baseUrlText = baseUrl.text.trim(),
-               modelNameText = modelName.text.trim();
-           var apiKeyOk = apiKeyText !== "",
-               baseUrlOk = !isCompatibleApi || baseUrlText !== "",
-               modelNameOk = !isCompatibleApi || modelNameText !== "";
-           if (!apiKeyOk)
-               apiKey.showError();
-           if (!baseUrlOk)
-               baseUrl.showError();
-           if (!modelNameOk)
-               modelName.showError();
-           if (!apiKeyOk || !baseUrlOk || !modelNameOk)
-               return;
-           if (!isCompatibleApi)
-               Download.installModel(
-                   filename,
-                   apiKeyText,
-               );
-           else
-               Download.installCompatibleModel(
-                   modelNameText,
-                   apiKeyText,
-                   baseUrlText,
-               );
-       }
-       Accessible.role: Accessible.Button
-       Accessible.name: qsTr("Install")
-       Accessible.description: qsTr("Install online model")
-   }
    ColumnLayout {
        spacing: 0
        Label {
@@ -390,69 +344,6 @@ ColumnLayout {
            Accessible.description: qsTr("Displayed when the file hash is being calculated")
        }
    }
-   MyTextField {
-       id: apiKey
-       visible: !installed && isOnline
-       Layout.topMargin: 20
-       Layout.leftMargin: 20
-       Layout.minimumWidth: 200
-       Layout.alignment: Qt.AlignTop | Qt.AlignHCenter
-       wrapMode: Text.WrapAnywhere
-       function showError() {
-           messageToast.show(qsTr("ERROR: $API_KEY is empty."));
-           apiKey.placeholderTextColor = theme.textErrorColor;
-       }
-       onTextChanged: {
-           apiKey.placeholderTextColor = theme.mutedTextColor;
-       }
-       placeholderText: qsTr("enter $API_KEY")
-       Accessible.role: Accessible.EditableText
-       Accessible.name: placeholderText
-       Accessible.description: qsTr("Whether the file hash is being calculated")
-   }
-   MyTextField {
-       id: baseUrl
-       visible: !installed && isOnline && isCompatibleApi
-       Layout.topMargin: 20
-       Layout.leftMargin: 20
-       Layout.minimumWidth: 200
-       Layout.alignment: Qt.AlignTop | Qt.AlignHCenter
-       wrapMode: Text.WrapAnywhere
-       function showError() {
-           messageToast.show(qsTr("ERROR: $BASE_URL is empty."));
-           baseUrl.placeholderTextColor = theme.textErrorColor;
-       }
-       onTextChanged: {
-           baseUrl.placeholderTextColor = theme.mutedTextColor;
-       }
-       placeholderText: qsTr("enter $BASE_URL")
-       Accessible.role: Accessible.EditableText
-       Accessible.name: placeholderText
-       Accessible.description: qsTr("Whether the file hash is being calculated")
-   }
-   MyTextField {
-       id: modelName
-       visible: !installed && isOnline && isCompatibleApi
-       Layout.topMargin: 20
-       Layout.leftMargin: 20
-       Layout.minimumWidth: 200
-       Layout.alignment: Qt.AlignTop | Qt.AlignHCenter
-       wrapMode: Text.WrapAnywhere
-       function showError() {
-           messageToast.show(qsTr("ERROR: $MODEL_NAME is empty."))
-           modelName.placeholderTextColor = theme.textErrorColor;
-       }
-       onTextChanged: {
-           modelName.placeholderTextColor = theme.mutedTextColor;
-       }
-       placeholderText: qsTr("enter $MODEL_NAME")
-       Accessible.role: Accessible.EditableText
-       Accessible.name: placeholderText
-       Accessible.description: qsTr("Whether the file hash is being calculated")
-   }
    }
  }
}


@@ -89,6 +89,13 @@ Rectangle {
            gpt4AllModelView.show();
        }
    }
+   MyTabButton {
+       text: qsTr("Remote Providers")
+       isSelected: remoteModelView.isShown()
+       onPressed: {
+           remoteModelView.show();
+       }
+   }
    MyTabButton {
        text: qsTr("HuggingFace")
        isSelected: huggingfaceModelView.isShown()
@@ -112,7 +119,20 @@ Rectangle {
            stackLayout.currentIndex = 0;
        }
        function isShown() {
-           return stackLayout.currentIndex === 0
+           return stackLayout.currentIndex === 0;
        }
    }
+   AddRemoteModelView {
+       id: remoteModelView
+       Layout.fillWidth: true
+       Layout.fillHeight: true
+       function show() {
+           stackLayout.currentIndex = 1;
+       }
+       function isShown() {
+           return stackLayout.currentIndex === 1;
+       }
+   }
@@ -126,10 +146,10 @@ Rectangle {
        anchors.fill: parent
        function show() {
-           stackLayout.currentIndex = 1;
+           stackLayout.currentIndex = 2;
        }
        function isShown() {
-           return stackLayout.currentIndex === 1
+           return stackLayout.currentIndex === 2;
        }
    }
}


@@ -0,0 +1,147 @@
import QtCore
import QtQuick
import QtQuick.Controls
import QtQuick.Controls.Basic
import QtQuick.Layouts
import QtQuick.Dialogs
import Qt.labs.folderlistmodel
import Qt5Compat.GraphicalEffects
import llm
import chatlistmodel
import download
import modellist
import network
import gpt4all
import mysettings
import localdocs
ColumnLayout {
Layout.fillWidth: true
Layout.alignment: Qt.AlignTop
spacing: 5
Label {
Layout.topMargin: 0
Layout.bottomMargin: 25
Layout.rightMargin: 150 * theme.fontScale
Layout.alignment: Qt.AlignTop
Layout.fillWidth: true
verticalAlignment: Text.AlignTop
text: qsTr("Various remote model providers that use network resources for inference.")
font.pixelSize: theme.fontSizeLarger
color: theme.textColor
wrapMode: Text.WordWrap
}
ScrollView {
id: scrollView
ScrollBar.vertical.policy: ScrollBar.AsNeeded
Layout.fillWidth: true
Layout.fillHeight: true
contentWidth: availableWidth
clip: true
Flow {
anchors.left: parent.left
anchors.right: parent.right
spacing: 20
bottomPadding: 20
property int childWidth: 330 * theme.fontScale
property int childHeight: 400 + 166 * theme.fontScale
RemoteModelCard {
width: parent.childWidth
height: parent.childHeight
providerBaseUrl: "https://api.groq.com/openai/v1/"
providerName: qsTr("Groq")
providerImage: "qrc:/gpt4all/icons/groq.svg"
providerDesc: qsTr('Groq offers a high-performance AI inference engine designed for low-latency and efficient processing. Optimized for real-time applications, Groq\'s technology is ideal for users who need fast responses from open large language models and other AI workloads.<br><br>Get your API key: <a href="https://console.groq.com/keys">https://groq.com/</a>')
modelWhitelist: [
// last updated 2025-02-24
"deepseek-r1-distill-llama-70b",
"deepseek-r1-distill-qwen-32b",
"gemma2-9b-it",
"llama-3.1-8b-instant",
"llama-3.2-1b-preview",
"llama-3.2-3b-preview",
"llama-3.3-70b-specdec",
"llama-3.3-70b-versatile",
"llama3-70b-8192",
"llama3-8b-8192",
"mixtral-8x7b-32768",
"qwen-2.5-32b",
"qwen-2.5-coder-32b",
]
}
RemoteModelCard {
width: parent.childWidth
height: parent.childHeight
providerBaseUrl: "https://api.openai.com/v1/"
providerName: qsTr("OpenAI")
providerImage: "qrc:/gpt4all/icons/openai.svg"
providerDesc: qsTr('OpenAI provides access to advanced AI models, including GPT-4, supporting a wide range of applications, from conversational AI to content generation and code completion.<br><br>Get your API key: <a href="https://platform.openai.com/signup">https://openai.com/</a>')
modelWhitelist: [
// last updated 2025-02-24
"gpt-3.5-turbo",
"gpt-3.5-turbo-16k",
"gpt-4",
"gpt-4-32k",
"gpt-4-turbo",
"gpt-4o",
]
}
RemoteModelCard {
width: parent.childWidth
height: parent.childHeight
providerBaseUrl: "https://api.mistral.ai/v1/"
providerName: qsTr("Mistral")
providerImage: "qrc:/gpt4all/icons/mistral.svg"
providerDesc: qsTr('Mistral AI specializes in efficient, open-weight language models optimized for various natural language processing tasks. Their models are designed for flexibility and performance, making them a solid option for applications requiring scalable AI solutions.<br><br>Get your API key: <a href="https://mistral.ai/">https://mistral.ai/</a>')
modelWhitelist: [
// last updated 2025-02-24
"codestral-2405",
"codestral-2411-rc5",
"codestral-2412",
"codestral-2501",
"codestral-latest",
"codestral-mamba-2407",
"codestral-mamba-latest",
"ministral-3b-2410",
"ministral-3b-latest",
"ministral-8b-2410",
"ministral-8b-latest",
"mistral-large-2402",
"mistral-large-2407",
"mistral-large-2411",
"mistral-large-latest",
"mistral-medium-2312",
"mistral-medium-latest",
"mistral-saba-2502",
"mistral-saba-latest",
"mistral-small-2312",
"mistral-small-2402",
"mistral-small-2409",
"mistral-small-2501",
"mistral-small-latest",
"mistral-tiny-2312",
"mistral-tiny-2407",
"mistral-tiny-latest",
"open-codestral-mamba",
"open-mistral-7b",
"open-mistral-nemo",
"open-mistral-nemo-2407",
"open-mixtral-8x22b",
"open-mixtral-8x22b-2404",
"open-mixtral-8x7b",
]
}
RemoteModelCard {
width: parent.childWidth
height: parent.childHeight
providerIsCustom: true
providerName: qsTr("Custom")
providerImage: "qrc:/gpt4all/icons/antenna_3.svg"
providerDesc: qsTr("The custom provider option allows users to connect their own OpenAI-compatible AI models or third-party inference services. This is useful for organizations with proprietary models or those leveraging niche AI providers not listed here.")
}
}
}
}


@@ -198,6 +198,7 @@ GridLayout {
            isError: false
            isThinking: true
            thinkingTime: modelData.thinkingTime
+           visible: modelData.content !== ""
        }
    }
}


@@ -60,27 +60,28 @@ ComboBox {
        highlighted: comboBox.highlightedIndex === index
    }
    popup: Popup {
-       // FIXME This should be made much nicer to take into account lists that are very long so
-       // that it is scrollable and also sized optimally taking into account the x,y and the content
-       // width and height as well as the window width and height
        y: comboBox.height - 1
        width: comboBox.width
-       implicitHeight: contentItem.implicitHeight + 20
+       implicitHeight: Math.min(window.height - y, contentItem.implicitHeight + 20)
        padding: 0
        contentItem: Rectangle {
-           implicitWidth: myListView.contentWidth
+           implicitWidth: comboBox.width
            implicitHeight: myListView.contentHeight
            color: "transparent"
-           ListView {
-               id: myListView
-               anchors.fill: parent
-               anchors.margins: 10
-               clip: true
-               implicitHeight: contentHeight
-               model: comboBox.popup.visible ? comboBox.delegateModel : null
-               currentIndex: comboBox.highlightedIndex
-               ScrollIndicator.vertical: ScrollIndicator { }
+           radius: 10
+           ScrollView {
+               anchors.fill: parent
+               anchors.margins: 10
+               clip: true
+               ScrollBar.vertical.policy: ScrollBar.AsNeeded
+               ScrollBar.horizontal.policy: ScrollBar.AlwaysOff
+               ListView {
+                   id: myListView
+                   implicitHeight: contentHeight
+                   model: comboBox.popup.visible ? comboBox.delegateModel : null
+                   currentIndex: comboBox.highlightedIndex
+                   ScrollIndicator.vertical: ScrollIndicator { }
+               }
            }
        }


@@ -0,0 +1,221 @@
import QtCore
import QtQuick
import QtQuick.Controls
import QtQuick.Controls.Basic
import QtQuick.Layouts
import QtQuick.Dialogs
import Qt.labs.folderlistmodel
import Qt5Compat.GraphicalEffects
import llm
import chatlistmodel
import download
import modellist
import network
import gpt4all
import mysettings
import localdocs
Rectangle {
property alias providerName: providerNameLabel.text
property alias providerImage: myimage.source
property alias providerDesc: providerDescLabel.text
property string providerBaseUrl: ""
property bool providerIsCustom: false
property var modelWhitelist: null
color: theme.conversationBackground
radius: 10
border.width: 1
border.color: theme.controlBorder
implicitHeight: topColumn.height + bottomColumn.height + 33 * theme.fontScale
ColumnLayout {
id: topColumn
anchors.left: parent.left
anchors.right: parent.right
anchors.top: parent.top
anchors.margins: 20
spacing: 15 * theme.fontScale
RowLayout {
Layout.alignment: Qt.AlignTop
spacing: 10
Item {
Layout.preferredWidth: 27 * theme.fontScale
Layout.preferredHeight: 27 * theme.fontScale
Layout.alignment: Qt.AlignLeft
Image {
id: myimage
anchors.centerIn: parent
sourceSize.width: parent.width
sourceSize.height: parent.height
mipmap: true
fillMode: Image.PreserveAspectFit
}
}
Label {
id: providerNameLabel
color: theme.textColor
font.pixelSize: theme.fontSizeBanner
}
}
Label {
id: providerDescLabel
Layout.fillWidth: true
wrapMode: Text.Wrap
color: theme.settingsTitleTextColor
font.pixelSize: theme.fontSizeLarge
onLinkActivated: function(link) { Qt.openUrlExternally(link); }
MouseArea {
anchors.fill: parent
acceptedButtons: Qt.NoButton // pass clicks to parent
cursorShape: parent.hoveredLink ? Qt.PointingHandCursor : Qt.ArrowCursor
}
}
}
ColumnLayout {
id: bottomColumn
anchors.left: parent.left
anchors.right: parent.right
anchors.bottom: parent.bottom
anchors.margins: 20
spacing: 30
ColumnLayout {
MySettingsLabel {
text: qsTr("API Key")
font.bold: true
font.pixelSize: theme.fontSizeLarge
color: theme.settingsTitleTextColor
}
MyTextField {
id: apiKeyField
Layout.fillWidth: true
font.pixelSize: theme.fontSizeLarge
wrapMode: Text.WrapAnywhere
function showError() {
messageToast.show(qsTr("ERROR: $API_KEY is empty."));
apiKeyField.placeholderTextColor = theme.textErrorColor;
}
onTextChanged: {
apiKeyField.placeholderTextColor = theme.mutedTextColor;
if (!providerIsCustom) {
let models = ModelList.remoteModelList(apiKeyField.text, providerBaseUrl);
if (modelWhitelist !== null)
models = models.filter(m => modelWhitelist.includes(m));
myModelList.model = models;
myModelList.currentIndex = -1;
}
}
placeholderText: qsTr("enter $API_KEY")
Accessible.role: Accessible.EditableText
Accessible.name: placeholderText
Accessible.description: qsTr("Whether the file hash is being calculated")
}
}
ColumnLayout {
visible: providerIsCustom
MySettingsLabel {
text: qsTr("Base Url")
font.bold: true
font.pixelSize: theme.fontSizeLarge
color: theme.settingsTitleTextColor
}
MyTextField {
id: baseUrlField
Layout.fillWidth: true
font.pixelSize: theme.fontSizeLarge
wrapMode: Text.WrapAnywhere
function showError() {
messageToast.show(qsTr("ERROR: $BASE_URL is empty."));
baseUrlField.placeholderTextColor = theme.textErrorColor;
}
onTextChanged: {
baseUrlField.placeholderTextColor = theme.mutedTextColor;
}
placeholderText: qsTr("enter $BASE_URL")
Accessible.role: Accessible.EditableText
Accessible.name: placeholderText
}
}
ColumnLayout {
visible: providerIsCustom
MySettingsLabel {
text: qsTr("Model Name")
font.bold: true
font.pixelSize: theme.fontSizeLarge
color: theme.settingsTitleTextColor
}
MyTextField {
id: modelNameField
Layout.fillWidth: true
font.pixelSize: theme.fontSizeLarge
wrapMode: Text.WrapAnywhere
function showError() {
messageToast.show(qsTr("ERROR: $MODEL_NAME is empty."))
modelNameField.placeholderTextColor = theme.textErrorColor;
}
onTextChanged: {
modelNameField.placeholderTextColor = theme.mutedTextColor;
}
placeholderText: qsTr("enter $MODEL_NAME")
Accessible.role: Accessible.EditableText
Accessible.name: placeholderText
}
}
ColumnLayout {
visible: myModelList.count > 0 && !providerIsCustom
MySettingsLabel {
text: qsTr("Models")
font.bold: true
font.pixelSize: theme.fontSizeLarge
color: theme.settingsTitleTextColor
}
RowLayout {
spacing: 10
MyComboBox {
Layout.fillWidth: true
id: myModelList
currentIndex: -1;
}
}
}
MySettingsButton {
id: installButton
Layout.alignment: Qt.AlignRight
text: qsTr("Install")
font.pixelSize: theme.fontSizeLarge
property string apiKeyText: apiKeyField.text.trim()
property string baseUrlText: providerIsCustom ? baseUrlField.text.trim() : providerBaseUrl.trim()
property string modelNameText: providerIsCustom ? modelNameField.text.trim() : myModelList.currentText.trim()
enabled: apiKeyText !== "" && baseUrlText !== "" && modelNameText !== ""
onClicked: {
Download.installCompatibleModel(
modelNameText,
apiKeyText,
baseUrlText,
);
myModelList.currentIndex = -1;
}
Accessible.role: Accessible.Button
Accessible.name: qsTr("Install")
Accessible.description: qsTr("Install remote model")
}
}
}
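The `onTextChanged` handler in this new view fetches the provider's model list and, when `modelWhitelist` is set, keeps only whitelisted entries before repopulating the combo box. The filtering step can be sketched in C++ (names like `applyWhitelist` are hypothetical stand-ins for the QML/`ModelList` logic, not APIs from the GPT4All sources):

```cpp
#include <algorithm>
#include <iterator>
#include <string>
#include <vector>

// A null whitelist means "show every model the provider reports", mirroring the
// `modelWhitelist !== null` check in the QML above.
std::vector<std::string> applyWhitelist(const std::vector<std::string> &models,
                                        const std::vector<std::string> *whitelist)
{
    if (!whitelist)
        return models;
    std::vector<std::string> out;
    std::copy_if(models.begin(), models.end(), std::back_inserter(out),
                 [&](const std::string &m) {
                     return std::find(whitelist->begin(), whitelist->end(), m)
                            != whitelist->end();
                 });
    return out;
}
```

After filtering, the view resets `currentIndex` to -1 so a stale selection from a previous API key cannot leak into the install request.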

View File

@ -7,24 +7,26 @@
 #include "toolcallparser.h"
 #include "toolmodel.h"

-#include <QBuffer>
+#include <QByteArray>
 #include <QDataStream>
 #include <QDebug>
-#include <QJsonDocument>
-#include <QJsonObject>
-#include <QJsonValue>
+#include <QFile>
+#include <QFileInfo>
+#include <QIODevice>
 #include <QLatin1String>
 #include <QMap>
 #include <QRegularExpression>
 #include <QString>
+#include <QVariant>
 #include <Qt>
+#include <QtAssert>
 #include <QtLogging>
+#include <optional>
 #include <utility>

 using namespace ToolEnums;

 Chat::Chat(QObject *parent)
     : QObject(parent)
     , m_id(Network::globalInstance()->generateUniqueId())

View File

@ -3,21 +3,26 @@
 #include "chatllm.h"
 #include "chatmodel.h"
-#include "database.h" // IWYU pragma: keep
-#include "localdocsmodel.h" // IWYU pragma: keep
+#include "database.h"
+#include "localdocsmodel.h"
 #include "modellist.h"
+#include "tool.h"

 #include <QDateTime>
 #include <QList>
 #include <QObject>
-#include <QQmlEngine>
+#include <QQmlEngine> // IWYU pragma: keep
 #include <QString>
 #include <QStringList> // IWYU pragma: keep
-#include <QStringView>
-#include <QtGlobal>
+#include <QUrl>
+#include <QVariant>
+#include <QtTypes>
+
+// IWYU pragma: no_forward_declare LocalDocsCollectionsModel
+// IWYU pragma: no_forward_declare ToolCallInfo

 class QDataStream;

 class Chat : public QObject
 {
     Q_OBJECT

View File

@ -2,6 +2,9 @@
 #include "utils.h"

+#include <fmt/format.h>
+
+#include <QAnyStringView>
 #include <QCoreApplication>
 #include <QDebug>
 #include <QGuiApplication>
@ -9,15 +12,17 @@
 #include <QJsonDocument>
 #include <QJsonObject>
 #include <QJsonValue>
+#include <QLatin1String>
 #include <QNetworkAccessManager>
 #include <QNetworkRequest>
+#include <QStringView>
 #include <QThread>
 #include <QUrl>
-#include <QUtf8StringView>
+#include <QUtf8StringView> // IWYU pragma: keep
 #include <QVariant>
 #include <QXmlStreamReader>
 #include <Qt>
-#include <QtGlobal>
+#include <QtAssert>
 #include <QtLogging>
 #include <expected>
@ -29,6 +34,7 @@ using namespace Qt::Literals::StringLiterals;
 //#define DEBUG

 ChatAPI::ChatAPI()
     : QObject(nullptr)
     , m_modelName("gpt-3.5-turbo")

View File

@ -3,10 +3,11 @@
 #include <gpt4all-backend/llmodel.h>

-#include <QByteArray> // IWYU pragma: keep
+#include <QByteArray>
 #include <QNetworkReply>
 #include <QObject>
 #include <QString>
+#include <QtPreprocessorSupport>
 #include <cstddef>
 #include <cstdint>
@ -17,9 +18,11 @@
 #include <unordered_map>
 #include <vector>

+// IWYU pragma: no_forward_declare QByteArray
+
+class ChatAPI;
 class QNetworkAccessManager;
-class ChatAPI;

 class ChatAPIWorker : public QObject {
     Q_OBJECT
 public:

View File

@ -1,23 +1,24 @@
 #include "chatlistmodel.h"

+#include "database.h" // IWYU pragma: keep
 #include "mysettings.h"

+#include <QCoreApplication>
 #include <QDataStream>
 #include <QDir>
 #include <QElapsedTimer>
+#include <QEvent>
 #include <QFile>
 #include <QFileInfo>
 #include <QGlobalStatic>
 #include <QGuiApplication>
 #include <QIODevice>
 #include <QSettings>
-#include <QString>
-#include <QStringList>
+#include <QStringList> // IWYU pragma: keep
 #include <Qt>
+#include <QtTypes>
 #include <algorithm>
+#include <memory>

 static constexpr quint32 CHAT_FORMAT_MAGIC = 0xF5D553CC;
 static constexpr qint32 CHAT_FORMAT_VERSION = 12;

View File

@ -7,17 +7,20 @@
 #include <QAbstractListModel>
 #include <QByteArray>
+#include <QDate>
 #include <QDebug>
 #include <QHash>
 #include <QList>
 #include <QMutex>
 #include <QObject>
+#include <QString>
 #include <QThread>
 #include <QVariant>
-#include <QVector>
+#include <QVector> // IWYU pragma: keep
 #include <Qt>
-#include <QtGlobal>
+#include <QtAssert>
 #include <QtLogging>
+#include <QtPreprocessorSupport>
 #include <memory>

View File

@ -15,32 +15,40 @@
 #include <minja/minja.hpp>
 #include <nlohmann/json.hpp>

+#include <QChar>
 #include <QDataStream>
 #include <QDebug>
 #include <QFile>
 #include <QGlobalStatic>
-#include <QIODevice>
+#include <QIODevice> // IWYU pragma: keep
 #include <QJsonDocument>
 #include <QJsonObject>
 #include <QJsonValue>
 #include <QMap>
-#include <QMutex>
+#include <QMutex> // IWYU pragma: keep
 #include <QMutexLocker> // IWYU pragma: keep
-#include <QRegularExpression>
-#include <QRegularExpressionMatch>
+#include <QRegularExpression> // IWYU pragma: keep
+#include <QRegularExpressionMatch> // IWYU pragma: keep
 #include <QSet>
+#include <QStringView>
 #include <QTextStream>
 #include <QUrl>
+#include <QVariant>
 #include <QWaitCondition>
 #include <Qt>
+#include <QtAssert>
 #include <QtLogging>
+#include <QtTypes> // IWYU pragma: keep
 #include <algorithm>
 #include <chrono>
 #include <cmath>
+#include <concepts>
 #include <cstddef>
+#include <cstdint>
 #include <ctime>
 #include <exception>
+#include <functional>
 #include <iomanip>
 #include <limits>
 #include <optional>
@ -976,7 +984,8 @@ public:
     {
         Q_UNUSED(bufferIdx)
         try {
-            m_cllm->m_chatModel->setResponseValue(response);
+            QString r = response;
+            m_cllm->m_chatModel->setResponseValue(removeLeadingWhitespace(r));
         } catch (const std::exception &e) {
             // We have a try/catch here because the main thread might have removed the response from
             // the chatmodel by erasing the conversation during the response... the main thread sets
@ -991,7 +1000,7 @@ public:
     bool onRegularResponse() override
     {
         auto respStr = QString::fromUtf8(m_result->response);
-        return onBufferResponse(removeLeadingWhitespace(respStr), 0);
+        return onBufferResponse(respStr, 0);
     }

     bool getStopGenerating() const override
@ -1078,7 +1087,7 @@ auto ChatLLM::promptInternal(
     auto respStr = QString::fromUtf8(result.response);
     if (!respStr.isEmpty() && (std::as_const(respStr).back().isSpace() || finalBuffers.size() > 1)) {
         if (finalBuffers.size() > 1)
-            m_chatModel->setResponseValue(finalBuffers.last());
+            m_chatModel->setResponseValue(finalBuffers.last().trimmed());
         else
             m_chatModel->setResponseValue(respStr.trimmed());
         emit responseChanged();
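The hunks above move the `removeLeadingWhitespace()` call from `onRegularResponse()` down into `onBufferResponse()`, so every buffered response path strips leading whitespace, not just the regular one. For reference, the helper's behavior can be sketched like this (the real helper in the GPT4All sources operates on `QString`; this `std::string` version is an illustrative stand-in):

```cpp
#include <cctype>
#include <cstddef>
#include <string>

// Drops whitespace only from the front of a streamed model response; trailing
// whitespace is left alone because more tokens may still be appended.
std::string removeLeadingWhitespace(const std::string &s)
{
    std::size_t i = 0;
    while (i < s.size() && std::isspace(static_cast<unsigned char>(s[i])))
        ++i;
    return s.substr(i);
}
```

Trimming at the buffer level means a tool-call buffer and the visible response buffer are normalized the same way before reaching the chat model.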

View File

@ -2,7 +2,7 @@
 #define CHATLLM_H

 #include "chatmodel.h"
-#include "database.h" // IWYU pragma: keep
+#include "database.h"
 #include "modellist.h"

 #include <gpt4all-backend/llmodel.h>
@ -10,29 +10,30 @@
 #include <QByteArray>
 #include <QElapsedTimer>
 #include <QFileInfo>
-#include <QList> // IWYU pragma: keep
+#include <QList>
 #include <QObject>
 #include <QPointer>
 #include <QString>
 #include <QStringList> // IWYU pragma: keep
-#include <QStringView>
 #include <QThread>
 #include <QVariantMap> // IWYU pragma: keep
-#include <QtGlobal>
+#include <QtNumeric>
 #include <atomic>
+#include <cstdint>
 #include <memory>
 #include <optional>
 #include <span>
 #include <string>
+#include <string_view>
 #include <variant>
+#include <vector>

 using namespace Qt::Literals::StringLiterals;

-class ChatViewResponseHandler;
+class ChatLLM;
 class QDataStream;

 // NOTE: values serialized to disk, do not change or reuse
 enum class LLModelTypeV0 { // chat versions 2-5
     MPT = 0,
@ -89,9 +90,6 @@ inline LLModelTypeV0 parseLLModelTypeV0(int v0)
     }
 }

-class ChatLLM;
-class ChatModel;
-
 struct LLModelInfo {
     std::unique_ptr<LLModel> model;
     QFileInfo fileInfo;

View File

@ -2,9 +2,11 @@
 #include <QDebug>
 #include <QMap>
-#include <QtGlobal>
+#include <QTextStream>
 #include <QtLogging>
+#include <exception>

 QList<ResultInfo> ChatItem::consolidateSources(const QList<ResultInfo> &sources)
 {

View File

@ -4,32 +4,41 @@
 #include "database.h"
 #include "tool.h"
 #include "toolcallparser.h"
-#include "utils.h"
+#include "utils.h" // IWYU pragma: keep
 #include "xlsxtomd.h"

 #include <fmt/format.h>

-#include <QApplication>
 #include <QAbstractListModel>
 #include <QBuffer>
 #include <QByteArray>
 #include <QClipboard>
 #include <QDataStream>
-#include <QJsonDocument>
+#include <QFileInfo>
+#include <QGuiApplication>
+#include <QIODevice>
 #include <QHash>
 #include <QList>
+#include <QMutex>
+#include <QMutexLocker> // IWYU pragma: keep
 #include <QObject>
-#include <QPair>
+#include <QPair> // IWYU pragma: keep
 #include <QString>
+#include <QStringList> // IWYU pragma: keep
+#include <QUrl>
 #include <QVariant>
-#include <QVector>
 #include <Qt>
-#include <QtGlobal>
+#include <QtAssert>
+#include <QtPreprocessorSupport>
+#include <QtTypes>
 #include <algorithm>
 #include <iterator>
+#include <list>
+#include <optional>
 #include <ranges>
 #include <span>
+#include <stdexcept>
 #include <utility>
 #include <vector>
@ -204,7 +213,8 @@ public:
     : QObject(nullptr)
     {
         moveToThread(parent->thread());
-        setParent(parent);
+        // setParent must be called from the thread the object lives in
+        QMetaObject::invokeMethod(this, [this, parent]() { this->setParent(parent); });
     }

 // NOTE: System messages are currently never serialized and only *stored* by the local server.
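The `setParent` fix above exploits a Qt rule: once `moveToThread()` has run, the object's state must only be mutated from its new thread, so the parent assignment is posted via `QMetaObject::invokeMethod` (which queues the call on the target thread's event loop) instead of being executed directly. A plain-C++ sketch of the underlying post-to-owner pattern, with a toy task queue standing in for Qt's event loop (an assumption for illustration, not Qt's implementation):

```cpp
#include <functional>
#include <mutex>
#include <queue>
#include <utility>

// Toy event loop: other threads post() closures; only the owning thread drain()s
// them, so all mutations of owned objects happen on one thread.
class EventLoop {
public:
    void post(std::function<void()> task) {
        std::lock_guard<std::mutex> lock(m_mutex);
        m_tasks.push(std::move(task));
    }

    // Runs queued tasks on whichever thread calls it -- the "owning" thread.
    void drain() {
        std::queue<std::function<void()>> tasks;
        {
            std::lock_guard<std::mutex> lock(m_mutex);
            std::swap(tasks, m_tasks);
        }
        while (!tasks.empty()) { tasks.front()(); tasks.pop(); }
    }

private:
    std::mutex m_mutex;
    std::queue<std::function<void()>> m_tasks;
};
```

Posting rather than calling directly is what makes the original `setParent(parent)` safe: the mutation waits until the object's own thread processes its queue.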

View File

@ -1,29 +1,32 @@
 #include "chatviewtextprocessor.h"

+#include <QAbstractTextDocumentLayout>
 #include <QBrush>
 #include <QChar>
 #include <QClipboard>
+#include <QDebug>
+#include <QFlag>
 #include <QFont>
+#include <QFontMetricsF>
 #include <QGuiApplication>
-#include <QList>
-#include <QPainter>
+#include <QList> // IWYU pragma: keep
+#include <QPair>
 #include <QQuickTextDocument>
 #include <QRegularExpression>
-#include <QStringList>
-#include <QTextBlock>
-#include <QTextCharFormat>
+#include <QStringList> // IWYU pragma: keep
+#include <QTextBlock> // IWYU pragma: keep
+#include <QTextCharFormat> // IWYU pragma: keep
 #include <QTextCursor>
 #include <QTextDocument>
 #include <QTextDocumentFragment>
-#include <QTextFrame>
-#include <QTextFrameFormat>
+#include <QTextFrame> // IWYU pragma: keep
+#include <QTextFrameFormat> // IWYU pragma: keep
 #include <QTextTableCell>
-#include <QVariant>
-#include <Qt>
-#include <QtGlobal>
+#include <QtAssert>
+#include <QtLogging>
 #include <algorithm>
+#include <utility>

 enum Language {
     None,

View File

@ -3,18 +3,15 @@
 #include <QColor>
 #include <QObject>
-#include <QQmlEngine>
-#include <QQuickTextDocument> // IWYU pragma: keep
-#include <QRectF>
-#include <QSizeF>
+#include <QQmlEngine> // IWYU pragma: keep
+#include <QQuickTextDocument>
 #include <QString>
 #include <QSyntaxHighlighter>
-#include <QTextObjectInterface>
-#include <QVector>
+#include <QVector> // IWYU pragma: keep
+#include <QtTypes>
+
+// IWYU pragma: no_forward_declare QQuickTextDocument

-class QPainter;
 class QTextDocument;
-class QTextFormat;

 struct CodeColors {
     Q_GADGET

View File

@ -1,12 +1,16 @@
 #include "codeinterpreter.h"

+#include <QJSEngine>
 #include <QJSValue>
-#include <QStringList>
+#include <QList>
+#include <QStringList> // IWYU pragma: keep
 #include <QThread>
 #include <QVariant>
+#include <Qt>

 using namespace Qt::Literals::StringLiterals;

 CodeInterpreter::CodeInterpreter()
     : Tool()
     , m_error(ToolEnums::Error::NoError)

View File

@ -4,11 +4,13 @@
 #include "tool.h"
 #include "toolcallparser.h"

-#include <QJSEngine>
 #include <QObject>
 #include <QString>
-#include <QtGlobal>
 #include <QThread>
+#include <QtAssert>
+
+class QJSEngine;

 class JavaScriptConsoleCapture : public QObject
 {

View File

@ -1,19 +1,21 @@
 #include "database.h"
 #include "mysettings.h"
-#include "utils.h"
+#include "utils.h" // IWYU pragma: keep

 #include <duckx/duckx.hpp>
 #include <fmt/format.h>
-#include <usearch/index.hpp>
 #include <usearch/index_plugins.hpp>

+#include <QByteArrayView>
 #include <QDebug>
 #include <QDir>
 #include <QDirIterator>
 #include <QFile>
 #include <QFileSystemWatcher>
+#include <QFlags>
 #include <QIODevice>
+#include <QKeyValueIterator>
 #include <QRegularExpression>
 #include <QSqlError>
 #include <QSqlQuery>
@ -22,8 +24,9 @@
 #include <QMap>
 #include <QUtf8StringView>
 #include <QVariant>
-#include <Qt>
 #include <QtLogging>
+#include <QtMinMax>
+#include <QtTypes>
 #include <algorithm>
 #include <cmath>
@ -46,6 +49,7 @@ namespace us = unum::usearch;
 //#define DEBUG
 //#define DEBUG_EXAMPLE

 namespace {

 /* QFile that checks input for binary data. If seen, it fails the read and returns true
@ -1111,9 +1115,9 @@ class DocumentReader {
 public:
     struct Metadata { QString title, author, subject, keywords; };

-    static std::unique_ptr<DocumentReader> fromDocument(const DocumentInfo &info);
+    static std::unique_ptr<DocumentReader> fromDocument(DocumentInfo info);

-    const DocumentInfo           &doc     () const { return *m_info; }
+    const DocumentInfo           &doc     () const { return m_info; }
     const Metadata               &metadata() const { return m_metadata; }
     const std::optional<QString> &word    () const { return m_word; }
     const std::optional<QString> &nextWord() { m_word = advance(); return m_word; }
@ -1123,8 +1127,8 @@ public:
     virtual ~DocumentReader() = default;

 protected:
-    explicit DocumentReader(const DocumentInfo &info)
-        : m_info(&info) {}
+    explicit DocumentReader(DocumentInfo info)
+        : m_info(std::move(info)) {}

     void postInit(Metadata &&metadata = {})
     {
@ -1134,9 +1138,9 @@ protected:
     virtual std::optional<QString> advance() = 0;

-    const DocumentInfo     *m_info;
+    DocumentInfo            m_info;
     Metadata                m_metadata;
     std::optional<QString>  m_word;
 };

 namespace {
@ -1144,8 +1148,8 @@ namespace {
 #ifdef GPT4ALL_USE_QTPDF
 class PdfDocumentReader final : public DocumentReader {
 public:
-    explicit PdfDocumentReader(const DocumentInfo &info)
-        : DocumentReader(info)
+    explicit PdfDocumentReader(DocumentInfo info)
+        : DocumentReader(std::move(info))
     {
         QString path = info.file.canonicalFilePath();
         if (m_doc.load(path) != QPdfDocument::Error::None)
@ -1185,8 +1189,8 @@ private:
 #else
 class PdfDocumentReader final : public DocumentReader {
 public:
-    explicit PdfDocumentReader(const DocumentInfo &info)
-        : DocumentReader(info)
+    explicit PdfDocumentReader(DocumentInfo info)
+        : DocumentReader(std::move(info))
     {
         QString path = info.file.canonicalFilePath();
         m_doc = FPDF_LoadDocument(path.toUtf8().constData(), nullptr);
@ -1277,8 +1281,8 @@ private:
 class WordDocumentReader final : public DocumentReader {
 public:
-    explicit WordDocumentReader(const DocumentInfo &info)
-        : DocumentReader(info)
+    explicit WordDocumentReader(DocumentInfo info)
+        : DocumentReader(std::move(info))
         , m_doc(info.file.canonicalFilePath().toStdString())
     {
         m_doc.open();
@ -1370,8 +1374,8 @@ protected:
 class TxtDocumentReader final : public DocumentReader {
 public:
-    explicit TxtDocumentReader(const DocumentInfo &info)
-        : DocumentReader(info)
+    explicit TxtDocumentReader(DocumentInfo info)
+        : DocumentReader(std::move(info))
         , m_file(info.file.canonicalFilePath())
     {
         if (!m_file.open(QIODevice::ReadOnly))
@ -1412,13 +1416,13 @@ protected:
 } // namespace

-std::unique_ptr<DocumentReader> DocumentReader::fromDocument(const DocumentInfo &doc)
+std::unique_ptr<DocumentReader> DocumentReader::fromDocument(DocumentInfo doc)
 {
     if (doc.isPdf())
-        return std::make_unique<PdfDocumentReader>(doc);
+        return std::make_unique<PdfDocumentReader>(std::move(doc));
     if (doc.isDocx())
-        return std::make_unique<WordDocumentReader>(doc);
-    return std::make_unique<TxtDocumentReader>(doc);
+        return std::make_unique<WordDocumentReader>(std::move(doc));
+    return std::make_unique<TxtDocumentReader>(std::move(doc));
 }

 ChunkStreamer::ChunkStreamer(Database *database)
@ -1426,12 +1430,12 @@ ChunkStreamer::ChunkStreamer(Database *database)
 ChunkStreamer::~ChunkStreamer() = default;

-void ChunkStreamer::setDocument(const DocumentInfo &doc, int documentId, const QString &embeddingModel)
+void ChunkStreamer::setDocument(DocumentInfo doc, int documentId, const QString &embeddingModel)
 {
     auto docKey = doc.key();
     if (!m_docKey || *m_docKey != docKey) {
         m_docKey = docKey;
-        m_reader = DocumentReader::fromDocument(doc);
+        m_reader = DocumentReader::fromDocument(std::move(doc));
         m_documentId = documentId;
         m_embeddingModel = embeddingModel;
         m_chunk.clear();
@ -1441,7 +1445,8 @@ void ChunkStreamer::setDocument(const DocumentInfo &doc, int documentId, const Q
         if (m_database->m_documentIdCache.contains(documentId)) {
             QSqlQuery q(m_database->m_db);
             if (!m_database->removeChunksByDocumentId(q, documentId))
-                handleDocumentError("ERROR: Cannot remove chunks of document", documentId, doc.file.canonicalPath(), q.lastError());
+                handleDocumentError("ERROR: Cannot remove chunks of document",
+                                    documentId, m_reader->doc().file.canonicalPath(), q.lastError());
         }
     }
 }
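The `DocumentReader` changes above replace `const DocumentInfo &` plus a stored pointer with a by-value "sink" parameter that is `std::move`d into an owned member, removing a dangling-reference hazard. The idiom, sketched with a hypothetical `Info` type (names here are illustrative, not from the GPT4All sources):

```cpp
#include <string>
#include <utility>

struct Info {
    std::string path;
};

class Reader {
public:
    explicit Reader(Info info)          // sink parameter: taken by value
        : m_info(std::move(info)) {}    // moved into the member, not copied again

    const Info &doc() const { return m_info; }

private:
    Info m_info;                        // owned copy, so no dangling reference
};

// Factories forward the sink down the call chain with std::move, as
// DocumentReader::fromDocument() does above.
Reader makeReader(Info info)
{
    return Reader(std::move(info));
}
```

Callers passing an rvalue pay only moves end to end, while the callee owns its data for its whole lifetime, which is exactly what the pointer-member version could not guarantee.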

View File

@ -1,7 +1,7 @@
 #ifndef DATABASE_H
 #define DATABASE_H

-#include "embllm.h" // IWYU pragma: keep
+#include "embllm.h"

 #include <QByteArray>
 #include <QChar>
@ -15,11 +15,11 @@
 #include <QSet>
 #include <QSqlDatabase>
 #include <QString>
-#include <QStringList>
+#include <QStringList> // IWYU pragma: keep
 #include <QThread>
 #include <QUrl>
-#include <QVector>
-#include <QtGlobal>
+#include <QVector> // IWYU pragma: keep
+#include <QtAssert>
 #include <atomic>
 #include <cstddef>
@ -28,7 +28,7 @@
 #include <memory>
 #include <optional>
 #include <utility>
-#include <vector>
+#include <vector> // IWYU pragma: keep

 using namespace Qt::Literals::StringLiterals;
@ -39,6 +39,7 @@ class QSqlQuery;
 class QTextStream;
 class QTimer;

 /* Version 0: GPT4All v2.4.3, full-text search
  * Version 1: GPT4All v2.5.3, embeddings in hsnwlib
  * Version 2: GPT4All v3.0.0, embeddings in sqlite
@ -171,7 +172,7 @@ public:
     explicit ChunkStreamer(Database *database);
     ~ChunkStreamer();

-    void setDocument(const DocumentInfo &doc, int documentId, const QString &embeddingModel);
+    void setDocument(DocumentInfo doc, int documentId, const QString &embeddingModel);
     std::optional<DocumentInfo::key_type> currentDocKey() const;
     void reset();

View File

@ -10,32 +10,37 @@
 #include <QDebug>
 #include <QGlobalStatic>
 #include <QGuiApplication>
-#include <QIODevice>
+#include <QIODevice> // IWYU pragma: keep
 #include <QJsonArray>
 #include <QJsonDocument>
 #include <QJsonObject>
 #include <QJsonValue>
+#include <QKeyValueIterator>
 #include <QLocale>
 #include <QNetworkRequest>
-#include <QPair>
+#include <QPair> // IWYU pragma: keep
+#include <QRegularExpression>
+#include <QRegularExpressionMatch>
 #include <QSettings>
 #include <QSslConfiguration>
 #include <QSslSocket>
-#include <QStringList>
+#include <QStringList> // IWYU pragma: keep
 #include <QTextStream>
 #include <QUrl>
 #include <QVariant>
-#include <QVector>
+#include <QVector> // IWYU pragma: keep
 #include <Qt>
+#include <QtAssert>
 #include <QtLogging>
+#include <QtMinMax>
+#include <algorithm>
 #include <compare>
 #include <cstddef>
 #include <utility>

 using namespace Qt::Literals::StringLiterals;

 class MyDownload: public Download { };
 Q_GLOBAL_STATIC(MyDownload, downloadInstance)
 Download *Download::globalInstance()

View File

@ -13,10 +13,14 @@
 #include <QSslError>
 #include <QString>
 #include <QThread>
-#include <QtGlobal>
+#include <QtTypes>
+
+// IWYU pragma: no_forward_declare QFile
+// IWYU pragma: no_forward_declare QList
+// IWYU pragma: no_forward_declare QSslError

 class QByteArray;

 struct ReleaseInfo {
     Q_GADGET
     Q_PROPERTY(QString version MEMBER version)

View File

@ -1,35 +1,35 @@
 #include "embllm.h"
-#include "modellist.h"
 #include "mysettings.h"

 #include <gpt4all-backend/llmodel.h>

 #include <QCoreApplication>
 #include <QDebug>
+#include <QFile>
 #include <QFileInfo>
 #include <QGuiApplication>
+#include <QIODevice>
 #include <QJsonArray>
 #include <QJsonDocument>
 #include <QJsonObject>
+#include <QJsonValue>
 #include <QList>
-#include <QMutexLocker>
+#include <QMutexLocker> // IWYU pragma: keep
 #include <QNetworkAccessManager>
 #include <QNetworkReply>
 #include <QNetworkRequest>
 #include <QUrl>
 #include <Qt>
-#include <QtGlobal>
+#include <QtAssert>
 #include <QtLogging>

 #include <exception>
+#include <string>
 #include <utility>
 #include <vector>

 using namespace Qt::Literals::StringLiterals;

 static const QString EMBEDDING_MODEL_NAME = u"nomic-embed-text-v1.5"_s;
 static const QString LOCAL_EMBEDDING_MODEL = u"nomic-embed-text-v1.5.f16.gguf"_s;
@ -359,8 +359,11 @@ void EmbeddingLLMWorker::handleFinished()
     if (retrievedData.isValid() && retrievedData.canConvert<QVector<EmbeddingChunk>>())
         chunks = retrievedData.value<QVector<EmbeddingChunk>>();

-    QVariant response = reply->attribute(QNetworkRequest::HttpStatusCodeAttribute);
-    Q_ASSERT(response.isValid());
+    QVariant response;
+    if (reply->error() == QNetworkReply::NoError) {
+        response = reply->attribute(QNetworkRequest::HttpStatusCodeAttribute);
+        Q_ASSERT(response.isValid());
+    }
     bool ok;
     int code = response.toInt(&ok);
     if (!ok || code != 200) {

View File

@ -5,10 +5,10 @@
 #include <QMutex>
 #include <QObject>
 #include <QString>
-#include <QStringList>
+#include <QStringList> // IWYU pragma: keep
 #include <QThread>
 #include <QVariant>
-#include <QVector>
+#include <QVector> // IWYU pragma: keep
 #include <atomic>
 #include <vector>
@ -16,6 +16,7 @@
 class LLModel;
 class QNetworkAccessManager;

 struct EmbeddingChunk {
     QString model; // TODO(jared): use to select model
     int folder_id;

View File

@ -1,16 +1,11 @@
 #include "jinja_helpers.h"
-#include "utils.h"
-
-#include <fmt/format.h>

 #include <QString>
 #include <QUrl>

-#include <iterator>
-#include <map>
 #include <ranges>
-#include <vector>
-#include <utility>
+#include <string>

 namespace views = std::views;
 using json = nlohmann::ordered_json;

View File

@ -5,7 +5,11 @@
 #include <nlohmann/json.hpp>

-#include <QtGlobal>
+#include <QtTypes> // IWYU pragma: keep
+
+// IWYU pragma: no_forward_declare MessageItem
+// IWYU pragma: no_forward_declare PromptAttachment
+// IWYU pragma: no_forward_declare ResultInfo

 using json = nlohmann::ordered_json;

View File

@ -2,6 +2,9 @@
 #include "jinja_replacements.h"

+#include <utility>

 // This is a list of prompt templates known to GPT4All and their associated replacements which are automatically used
 // instead when loading the chat template from GGUF. These exist for two primary reasons:
 // - HuggingFace model authors make ugly chat templates because they do not expect the end user to see them;
@ -113,11 +116,31 @@ const std::unordered_map<std::string_view, std::string_view> CHAT_TEMPLATE_SUBST
 {%- elif message['role'] == 'system' %}
 {{- '<|system|>\n' + message['content'] + eos_token }}
 {%- elif message['role'] == 'assistant' %}
 {{- '<|assistant|>\n' + message['content'] + eos_token }}
 {%- endif %}
 {%- if loop.last and add_generation_prompt %}
 {{- '<|assistant|>' }}
 {%- endif %}
+{%- endfor %})TEMPLATE",
+},
+// granite-3.1-3b-a800m-instruct-Q4_0.gguf, granite-3.1-8b-instruct-Q4_0.gguf (nomic-ai/gpt4all#3471)
+{
+// original
+R"TEMPLATE({%- if messages[0]['role'] == 'system' %}{%- set system_message = messages[0]['content'] %}{%- set loop_messages = messages[1:] %}{%- else %}{%- set system_message = "Knowledge Cutoff Date: April 2024. You are Granite, developed by IBM." %}{%- if tools and documents %}{%- set system_message = system_message + " You are a helpful AI assistant with access to the following tools. When a tool is required to answer the user's query, respond with <|tool_call|> followed by a JSON list of tools used. If a tool does not exist in the provided list of tools, notify the user that you do not have the ability to fulfill the request. Write the response to the user's input by strictly aligning with the facts in the provided documents. If the information needed to answer the question is not available in the documents, inform the user that the question cannot be answered based on the available data." %}{%- elif tools %}{%- set system_message = system_message + " You are a helpful AI assistant with access to the following tools. When a tool is required to answer the user's query, respond with <|tool_call|> followed by a JSON list of tools used. If a tool does not exist in the provided list of tools, notify the user that you do not have the ability to fulfill the request." %}{%- elif documents %}{%- set system_message = system_message + " Write the response to the user's input by strictly aligning with the facts in the provided documents. If the information needed to answer the question is not available in the documents, inform the user that the question cannot be answered based on the available data." %}{%- else %}{%- set system_message = system_message + " You are a helpful AI assistant." %}{%- endif %}{%- if controls and 'citations' in controls and documents %}{%- set system_message = system_message + ' In your response, use the symbols <co> and </co> to indicate when a fact comes from a document in the search result, e.g <co>0</co> for a fact from document 0. 
+Afterwards, list all the citations with their corresponding documents in an ordered list.' %}{%- endif %}{%- if controls and 'hallucinations' in controls and documents %}{%- set system_message = system_message + ' Finally, after the response is written, include a numbered list of sentences from the response that are potentially hallucinated and not based in the documents.' %}{%- endif %}{%- set loop_messages = messages %}{%- endif %}{{- '<|start_of_role|>system<|end_of_role|>' + system_message + '<|end_of_text|> ' }}{%- if tools %}{{- '<|start_of_role|>tools<|end_of_role|>' }}{{- tools | tojson(indent=4) }}{{- '<|end_of_text|> ' }}{%- endif %}{%- if documents %}{{- '<|start_of_role|>documents<|end_of_role|>' }}{%- for document in documents %}{{- 'Document ' + loop.index0 | string + ' ' }}{{- document['text'] }}{%- if not loop.last %}{{- ' '}}{%- endif%}{%- endfor %}{{- '<|end_of_text|> ' }}{%- endif %}{%- for message in loop_messages %}{{- '<|start_of_role|>' + message['role'] + '<|end_of_role|>' + message['content'] + '<|end_of_text|> ' }}{%- if loop.last and add_generation_prompt %}{{- '<|start_of_role|>assistant' }}{%- if controls %}{{- ' ' + controls | tojson()}}{%- endif %}{{- '<|end_of_role|>' }}{%- endif %}{%- endfor %})TEMPLATE",
+// replacement
+R"TEMPLATE({%- if messages[0]['role'] == 'system' %}
+{%- set system_message = messages[0]['content'] %}
+{%- set loop_messages = messages[1:] %}
+{%- else %}
+{%- set system_message = "Knowledge Cutoff Date: April 2024. You are Granite, developed by IBM. You are a helpful AI assistant." %}
+{%- set loop_messages = messages %}
+{%- endif %}
+{{- '<|start_of_role|>system<|end_of_role|>' + system_message + '<|end_of_text|> ' }}
+{%- for message in loop_messages %}
+{{- '<|start_of_role|>' + message['role'] + '<|end_of_role|>' + message['content'] + '<|end_of_text|> ' }}
+{%- if loop.last and add_generation_prompt %}
+{{- '<|start_of_role|>assistant<|end_of_role|>' }}
+{%- endif %}
 {%- endfor %})TEMPLATE",
 },
 // Hermes-3-Llama-3.2-3B.Q4_0.gguf, mistral-7b-openorca.gguf2.Q4_0.gguf
@ -618,6 +641,70 @@ const std::unordered_map<std::string_view, std::string_view> CHAT_TEMPLATE_SUBST
 {%- if add_generation_prompt %}
 {{- '<|im_start|>assistant\n' }}
 {%- endif %})TEMPLATE",
+},
+// OLMoE-1B-7B-0125-Instruct-Q4_0.gguf (nomic-ai/gpt4all#3471)
+{
+// original
+R"TEMPLATE({{ bos_token }}{% for message in messages %}{% if message['role'] == 'system' %}{{ '<|system|>
+' + message['content'] + '
+' }}{% elif message['role'] == 'user' %}{{ '<|user|>
+' + message['content'] + '
+' }}{% elif message['role'] == 'assistant' %}{% if not loop.last %}{{ '<|assistant|>
+' + message['content'] + eos_token + '
+' }}{% else %}{{ '<|assistant|>
+' + message['content'] + eos_token }}{% endif %}{% endif %}{% if loop.last and add_generation_prompt %}{{ '<|assistant|>
+' }}{% endif %}{% endfor %})TEMPLATE",
+// replacement
+R"TEMPLATE({{- bos_token }}
+{%- for message in messages %}
+{%- if message['role'] == 'system' %}
+{{- '<|system|>\n' + message['content'] + '\n' }}
+{%- elif message['role'] == 'user' %}
+{{- '<|user|>\n' + message['content'] + '\n' }}
+{%- elif message['role'] == 'assistant' %}
+{%- if not loop.last %}
+{{- '<|assistant|>\n' + message['content'] + eos_token + '\n' }}
+{%- else %}
+{{- '<|assistant|>\n' + message['content'] + eos_token }}
+{%- endif %}
+{%- endif %}
+{%- if loop.last and add_generation_prompt %}
+{{- '<|assistant|>\n' }}
+{%- endif %}
+{%- endfor %})TEMPLATE",
+},
+// OLMoE-1B-7B-0924-Instruct-Q4_0.gguf (nomic-ai/gpt4all#3471)
+{
+// original
+R"TEMPLATE({{ bos_token }}{% for message in messages %}
+{% if message['role'] == 'system' %}
+{{ '<|system|>
+' + message['content'] }}
+{% elif message['role'] == 'user' %}
+{{ '<|user|>
+' + message['content'] }}
+{% elif message['role'] == 'assistant' %}
+{{ '<|assistant|>
+' + message['content'] + eos_token }}
+{% endif %}
+{% if loop.last and add_generation_prompt %}
+{{ '<|assistant|>' }}
+{% endif %}
+{% endfor %})TEMPLATE",
+// replacement
+R"TEMPLATE({{- bos_token }}
+{%- for message in messages %}
+{%- if message['role'] == 'system' %}
+{{- '<|system|>\n' + message['content'] }}
+{%- elif message['role'] == 'user' %}
+{{- '<|user|>\n' + message['content'] }}
+{%- elif message['role'] == 'assistant' %}
+{{- '<|assistant|>\n' + message['content'] + eos_token }}
+{%- endif %}
+{%- if loop.last and add_generation_prompt %}
+{{- '<|assistant|>' }}
+{%- endif %}
+{%- endfor %})TEMPLATE",
 },
 // Phi-3.1-mini-128k-instruct-Q4_0.gguf (nomic-ai/gpt4all#3346)
 {

View File

@ -12,6 +12,9 @@
 #include <QSettings>
 #include <QUrl>
 #include <QtLogging>
+#include <QtSystemDetection>
+
+#include <string>

 #ifdef GPT4ALL_OFFLINE_INSTALLER
 #   include <QDesktopServices>
@ -25,6 +28,7 @@
 using namespace Qt::Literals::StringLiterals;

 class MyLLM: public LLM { };
 Q_GLOBAL_STATIC(MyLLM, llmInstance)

 LLM *LLM::globalInstance()

View File

@ -3,7 +3,8 @@
 #include <QObject>
 #include <QString>
-#include <QtGlobal>
+#include <QtTypes>

 class LLM : public QObject
 {

View File

@ -5,10 +5,14 @@
 #include "mysettings.h"

 #include <QCoreApplication>
+#include <QDebug>
 #include <QGlobalStatic>
 #include <QGuiApplication>
+#include <QList>
 #include <QUrl>
 #include <Qt>
+#include <QtLogging>

 class MyLocalDocs: public LocalDocs { };
 Q_GLOBAL_STATIC(MyLocalDocs, localDocsInstance)

View File

@ -2,11 +2,14 @@
 #define LOCALDOCS_H

 #include "database.h"
-#include "localdocsmodel.h" // IWYU pragma: keep
+#include "localdocsmodel.h"

 #include <QObject>
 #include <QString>
-#include <QStringList>
+#include <QStringList> // IWYU pragma: keep
+
+// IWYU pragma: no_forward_declare LocalDocsModel

 class LocalDocs : public QObject
 {

View File

@ -5,11 +5,11 @@
 #include <QDateTime>
 #include <QMap>
-#include <QVector>
-#include <QtGlobal>
+#include <QVector> // IWYU pragma: keep

 #include <utility>

 LocalDocsCollectionsModel::LocalDocsCollectionsModel(QObject *parent)
     : QSortFilterProxyModel(parent)
 {

View File

@ -4,17 +4,19 @@
 #include "database.h"

 #include <QAbstractListModel>
-#include <QByteArray>
-#include <QHash>
 #include <QList>
-#include <QObject>
+#include <QObject> // IWYU pragma: keep
 #include <QSortFilterProxyModel>
 #include <QString>
-#include <QVariant>
 #include <Qt>

 #include <functional>

+class QByteArray;
+class QVariant;
+template <typename Key, typename T> class QHash;

 class LocalDocsCollectionsModel : public QSortFilterProxyModel
 {
     Q_OBJECT

View File

@ -2,8 +2,10 @@
 #include <QDateTime>
 #include <QDebug>
+#include <QFlags>
 #include <QGlobalStatic>
 #include <QIODevice>
+#include <QMutexLocker> // IWYU pragma: keep
 #include <QStandardPaths>

 #include <cstdio>
@ -12,6 +14,7 @@
 using namespace Qt::Literals::StringLiterals;

 class MyLogger: public Logger { };
 Q_GLOBAL_STATIC(MyLogger, loggerInstance)

 Logger *Logger::globalInstance()
@ -62,8 +65,11 @@ void Logger::messageHandler(QtMsgType type, const QMessageLogContext &, const QS
     }

     // Get time and date
     auto timestamp = QDateTime::currentDateTime().toString();
-    // Write message
     const std::string out = u"[%1] (%2): %3\n"_s.arg(typeString, timestamp, msg).toStdString();
+
+    // Write message
+    QMutexLocker locker(&logger->m_mutex);
     logger->m_file.write(out.c_str());
     logger->m_file.flush();
     std::cerr << out;

View File

@ -2,19 +2,24 @@
 #define LOGGER_H

 #include <QFile>
+#include <QMutex>
 #include <QString>
 #include <QtLogging>

-class Logger
-{
-    QFile m_file;
-
-    static void messageHandler(QtMsgType type, const QMessageLogContext &context, const QString &msg);
-
+class Logger {
 public:
+    explicit Logger();
     static Logger *globalInstance();

-    explicit Logger();
+private:
+    static void messageHandler(QtMsgType type, const QMessageLogContext &context, const QString &msg);

+private:
+    QFile m_file;
+    QMutex m_mutex;

     friend class MyLogger;
 };

View File

@ -2,6 +2,7 @@
 #include <Cocoa/Cocoa.h>

 void MacOSDock::showIcon()
 {
     [[NSApplication sharedApplication] setActivationPolicy:NSApplicationActivationPolicyRegular];

View File

@ -12,18 +12,24 @@
 #include <gpt4all-backend/llmodel.h>
 #include <singleapplication.h>

+#include <QByteArray>
 #include <QCoreApplication>
 #include <QFont>
 #include <QFontDatabase>
+#include <QList>
 #include <QObject>
 #include <QQmlApplicationEngine>
 #include <QQmlContext>
 #include <QQuickWindow>
 #include <QSettings>
 #include <QString>
+#include <QStringList>
 #include <QUrl>
 #include <QVariant>
+#include <QWindow>
 #include <Qt>
+#include <QtAssert>
+#include <QtSystemDetection>

 #if G4A_CONFIG(force_d3d12)
 #   include <QSGRendererInterface>

View File

@ -9,9 +9,11 @@
 #include <QChar>
 #include <QCoreApplication>
+#include <QCryptographicHash>
 #include <QDebug>
 #include <QDir>
 #include <QDirIterator>
+#include <QEvent>
 #include <QEventLoop>
 #include <QFile>
 #include <QFileInfo>
@ -29,14 +31,15 @@
 #include <QSslConfiguration>
 #include <QSslSocket>
 #include <QStandardPaths>
-#include <QStringList>
+#include <QStringList> // IWYU pragma: keep
 #include <QTextStream>
 #include <QTimer>
 #include <QUrl>
+#include <QtAssert>
 #include <QtLogging>
+#include <QtPreprocessorSupport>

 #include <algorithm>
-#include <cstddef>
 #include <iterator>
 #include <optional>
 #include <string>
@ -499,10 +502,11 @@ bool GPT4AllDownloadableModels::filterAcceptsRow(int sourceRow,
     bool hasDescription = !description.isEmpty();
     bool isClone = sourceModel()->data(index, ModelList::IsCloneRole).toBool();
     bool isDiscovered = sourceModel()->data(index, ModelList::IsDiscoveredRole).toBool();
+    bool isOnline = sourceModel()->data(index, ModelList::OnlineRole).toBool();
     bool satisfiesKeyword = m_keywords.isEmpty();
     for (const QString &k : m_keywords)
         satisfiesKeyword = description.contains(k) ? true : satisfiesKeyword;
-    return !isDiscovered && hasDescription && !isClone && satisfiesKeyword;
+    return !isOnline && !isDiscovered && hasDescription && !isClone && satisfiesKeyword;
 }
 int GPT4AllDownloadableModels::count() const
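The `filterAcceptsRow` change above excludes online (remote-provider) models and keeps the keyword rule: an empty keyword list matches everything, and otherwise any single keyword hit in the description suffices. A small Python sketch of that predicate, with illustrative names rather than the actual Qt model roles:

```python
def filter_accepts(description, keywords, is_online=False, is_discovered=False,
                   is_clone=False):
    """Mirror of the C++ predicate: reject online, discovered, clone, and
    description-less rows; an empty keyword list matches everything,
    otherwise any keyword found in the description is enough."""
    satisfies_keyword = not keywords or any(k in description for k in keywords)
    return (not is_online and not is_discovered and bool(description)
            and not is_clone and satisfies_keyword)

assert filter_accepts("A small chat model", [])                     # no keywords
assert filter_accepts("A small chat model", ["chat", "vision"])     # one hit suffices
assert not filter_accepts("A small chat model", ["vision"])         # no hit
assert not filter_accepts("A hosted model", ["model"], is_online=True)
```

Note the C++ loop (`satisfiesKeyword = description.contains(k) ? true : satisfiesKeyword;`) is just an "any" fold that can never flip back to false once a keyword matches, which is why `any(...)` is an equivalent rendering.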
@ -2353,3 +2357,56 @@ void ModelList::handleDiscoveryItemErrorOccurred(QNetworkReply::NetworkError cod
     qWarning() << u"ERROR: Discovery item failed with error code \"%1-%2\""_s
         .arg(code).arg(reply->errorString()).toStdString();
 }
+QStringList ModelList::remoteModelList(const QString &apiKey, const QUrl &baseUrl)
+{
+    QStringList modelList;
+
+    // Create the request
+    QNetworkRequest request;
+    request.setUrl(baseUrl.resolved(QUrl("models")));
+    request.setHeader(QNetworkRequest::ContentTypeHeader, "application/json");
+
+    // Add the Authorization header
+    const QString bearerToken = QString("Bearer %1").arg(apiKey);
+    request.setRawHeader("Authorization", bearerToken.toUtf8());
+
+    // Make the GET request
+    QNetworkReply *reply = m_networkManager.get(request);
+
+    // We use a local event loop to wait for the request to complete
+    QEventLoop loop;
+    connect(reply, &QNetworkReply::finished, &loop, &QEventLoop::quit);
+    loop.exec();
+
+    // Check for errors
+    if (reply->error() == QNetworkReply::NoError) {
+        // Parse the JSON response
+        const QByteArray responseData = reply->readAll();
+        const QJsonDocument jsonDoc = QJsonDocument::fromJson(responseData);
+        if (!jsonDoc.isNull() && jsonDoc.isObject()) {
+            QJsonObject rootObj = jsonDoc.object();
+            QJsonValue dataValue = rootObj.value("data");
+            if (dataValue.isArray()) {
+                QJsonArray dataArray = dataValue.toArray();
+                for (const QJsonValue &val : dataArray) {
+                    if (val.isObject()) {
+                        QJsonObject obj = val.toObject();
+                        const QString modelId = obj.value("id").toString();
+                        modelList.append(modelId);
+                    }
+                }
+            }
+        }
+    } else {
+        // Handle network error (e.g. print it to qDebug)
+        qWarning() << "Error retrieving models:" << reply->errorString();
+    }
+
+    // Clean up
+    reply->deleteLater();
+
+    return modelList;
+}
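The new `remoteModelList` above queries an OpenAI-compatible `GET /models` endpoint and walks a `{"data": [{"id": ...}, ...]}` response. A Python sketch of the same parsing logic (illustrative helper, not part of GPT4All):

```python
import json

def parse_model_ids(body: bytes) -> list:
    """Extract model IDs from an OpenAI-style GET /models response body,
    mirroring the QJsonDocument walk in remoteModelList: missing or
    non-string "id" fields become empty strings, just as
    QJsonValue::toString() would return."""
    try:
        doc = json.loads(body)
    except ValueError:
        return []
    ids = []
    if isinstance(doc, dict) and isinstance(doc.get("data"), list):
        for entry in doc["data"]:
            if isinstance(entry, dict):
                value = entry.get("id")
                ids.append(value if isinstance(value, str) else "")
    return ids

sample = json.dumps({"data": [{"id": "gpt-4o"}, {"id": "gpt-4o-mini"}]}).encode()
assert parse_model_ids(sample) == ["gpt-4o", "gpt-4o-mini"]
```

One detail worth noting about the C++ side: `QUrl::resolved(QUrl("models"))` follows RFC 3986 relative resolution, so the configured base URL generally needs a trailing slash (e.g. `https://api.openai.com/v1/`) for `models` to be appended rather than replace the last path segment.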

View File

@ -5,25 +5,29 @@
 #include <QByteArray>
 #include <QDateTime>
 #include <QHash>
-#include <QLatin1StringView>
+#include <QLatin1StringView> // IWYU pragma: keep
 #include <QList>
 #include <QMutex>
 #include <QNetworkAccessManager>
 #include <QNetworkReply>
 #include <QObject>
-#include <QPair>
-#include <QQmlEngine>
+#include <QPair> // IWYU pragma: keep
+#include <QQmlEngine> // IWYU pragma: keep
 #include <QSortFilterProxyModel>
 #include <QSslError>
 #include <QString>
 #include <QVariant>
-#include <QVector>
+#include <QVector> // IWYU pragma: keep
 #include <Qt>
-#include <QtGlobal>
+#include <QtTypes>

 #include <optional>
 #include <utility>

+// IWYU pragma: no_forward_declare QObject
+// IWYU pragma: no_forward_declare QSslError
+class QUrl;

 using namespace Qt::Literals::StringLiterals;
@ -530,6 +534,8 @@ public:
     Q_INVOKABLE void discoverSearch(const QString &discover);

+    Q_INVOKABLE QStringList remoteModelList(const QString &apiKey, const QUrl &baseUrl);
+
 Q_SIGNALS:
     void countChanged();
     void installedModelsChanged();

View File

@ -11,22 +11,27 @@
 #include <QFileInfo>
 #include <QGlobalStatic>
 #include <QGuiApplication>
-#include <QIODevice>
+#include <QIODevice> // IWYU pragma: keep
 #include <QMap>
 #include <QMetaObject>
 #include <QStandardPaths>
 #include <QThread>
 #include <QUrl>
+#include <QVariant>
 #include <QtLogging>
+#include <QtAssert>

 #include <algorithm>
 #include <string>
 #include <thread>
 #include <vector>
+
+#if !(defined(Q_OS_MAC) && defined(__aarch64__))
+#include <cstring>
+#endif

 using namespace Qt::Literals::StringLiterals;

 // used only for settings serialization, do not translate
 static const QStringList suggestionModeNames { "LocalDocsOnly", "On", "Off" };
 static const QStringList chatThemeNames { "Light", "Dark", "LegacyDark" };

View File

@ -4,20 +4,24 @@
 #include "modellist.h" // IWYU pragma: keep

 #include <QDateTime>
-#include <QLatin1StringView>
+#include <QLatin1StringView> // IWYU pragma: keep
 #include <QList>
 #include <QModelIndex>
 #include <QObject>
 #include <QSettings>
 #include <QString>
-#include <QStringList>
+#include <QStringList> // IWYU pragma: keep
 #include <QTranslator>
-#include <QVector>
+#include <QVariant>

 #include <cstdint>
 #include <memory>
 #include <optional>

+// IWYU pragma: no_forward_declare QModelIndex
+class QLocale;

 namespace MySettingsEnums {
 Q_NAMESPACE

View File

@ -8,7 +8,6 @@
 #include "localdocsmodel.h"
 #include "modellist.h"
 #include "mysettings.h"
-#include "utils.h"

 #include <gpt4all-backend/llmodel.h>
@ -29,7 +28,6 @@
 #include <QSslSocket>
 #include <QSysInfo>
 #include <Qt>
-#include <QtGlobal>
 #include <QtLogging>
 #include <QUrl>
 #include <QUuid>
@ -49,6 +47,7 @@ using namespace Qt::Literals::StringLiterals;
 #define STR_(x) #x
 #define STR(x) STR_(x)

 static const char MIXPANEL_TOKEN[] = "ce362e568ddaee16ed243eaffb5860a2";

 #ifdef __clang__
@ -242,6 +241,12 @@ void Network::handleJsonUploadFinished()
     m_activeUploads.removeAll(jsonReply);

+    if (jsonReply->error() != QNetworkReply::NoError) {
+        qWarning() << "Request to" << jsonReply->url().toString() << "failed:" << jsonReply->errorString();
+        jsonReply->deleteLater();
+        return;
+    }
+
     QVariant response = jsonReply->attribute(QNetworkRequest::HttpStatusCodeAttribute);
     Q_ASSERT(response.isValid());
     bool ok;
@ -449,6 +454,11 @@ void Network::handleIpifyFinished()
     QNetworkReply *reply = qobject_cast<QNetworkReply *>(sender());
     if (!reply)
         return;

+    if (reply->error() != QNetworkReply::NoError) {
+        qWarning() << "Request to" << reply->url().toString() << "failed:" << reply->errorString();
+        reply->deleteLater();
+        return;
+    }
+
     QVariant response = reply->attribute(QNetworkRequest::HttpStatusCodeAttribute);
     Q_ASSERT(response.isValid());
@ -473,6 +483,11 @@ void Network::handleMixpanelFinished()
     QNetworkReply *reply = qobject_cast<QNetworkReply *>(sender());
     if (!reply)
         return;

+    if (reply->error() != QNetworkReply::NoError) {
+        qWarning() << "Request to" << reply->url().toString() << "failed:" << reply->errorString();
+        reply->deleteLater();
+        return;
+    }
+
     QVariant response = reply->attribute(QNetworkRequest::HttpStatusCodeAttribute);
     Q_ASSERT(response.isValid());
@ -511,6 +526,11 @@ void Network::handleHealthFinished()
     QNetworkReply *healthReply = qobject_cast<QNetworkReply *>(sender());
     if (!healthReply)
         return;

+    if (healthReply->error() != QNetworkReply::NoError) {
+        qWarning() << "Request to" << healthReply->url().toString() << "failed:" << healthReply->errorString();
+        healthReply->deleteLater();
+        return;
+    }
+
     QVariant response = healthReply->attribute(QNetworkRequest::HttpStatusCodeAttribute);
     Q_ASSERT(response.isValid());

View File

@ -11,7 +11,14 @@
 #include <QSslError>
 #include <QString>
 #include <QVariant>
-#include <QVector>
+#include <QVariantMap> // IWYU pragma: keep
+#include <QVector> // IWYU pragma: keep
+
+// IWYU pragma: no_forward_declare QByteArray
+// IWYU pragma: no_forward_declare QNetworkReply
+// IWYU pragma: no_forward_declare QSslError
+class QUrl;

 struct KeyValue {
     QString key;

View File

@ -4,9 +4,10 @@
 #include "chatmodel.h"
 #include "modellist.h"
 #include "mysettings.h"
-#include "utils.h"
+#include "utils.h" // IWYU pragma: keep

 #include <fmt/format.h>
+#include <gpt4all-backend/llmodel.h>

 #include <QByteArray>
 #include <QCborArray>
@ -15,32 +16,38 @@
 #include <QDateTime>
 #include <QDebug>
 #include <QHostAddress>
+#include <QHttpHeaders>
 #include <QHttpServer>
+#include <QHttpServerRequest>
 #include <QHttpServerResponder>
 #include <QJsonArray>
 #include <QJsonDocument>
 #include <QJsonObject>
 #include <QJsonValue>
 #include <QLatin1StringView>
-#include <QPair>
+#include <QPair> // IWYU pragma: keep
+#include <QTcpServer>
 #include <QVariant>
 #include <Qt>
+#include <QtAssert>
 #include <QtCborCommon>
-#include <QtGlobal>
 #include <QtLogging>
+#include <QtMinMax>
+#include <QtPreprocessorSupport>
+#include <QtTypes>

 #include <cstdint>
+#include <exception>
 #include <iostream>
 #include <optional>
+#include <span>
 #include <stdexcept>
 #include <string>
-#include <type_traits>
+#include <string_view>
 #include <unordered_map>
 #include <utility>
-#include <variant>
-
-#if QT_VERSION >= QT_VERSION_CHECK(6, 8, 0)
-#   include <QTcpServer>
-#endif
+#include <vector>

 using namespace std::string_literals;
 using namespace Qt::Literals::StringLiterals;
@@ -451,23 +458,17 @@ static QJsonObject requestFromJson(const QByteArray &request)
 void Server::start()
 {
     m_server = std::make_unique<QHttpServer>(this);
-#if QT_VERSION >= QT_VERSION_CHECK(6, 8, 0)
     auto *tcpServer = new QTcpServer(m_server.get());
-#else
-    auto *tcpServer = m_server.get();
-#endif

     auto port = MySettings::globalInstance()->networkPort();
     if (!tcpServer->listen(QHostAddress::LocalHost, port)) {
         qWarning() << "Server ERROR: Failed to listen on port" << port;
         return;
     }
-#if QT_VERSION >= QT_VERSION_CHECK(6, 8, 0)
     if (!m_server->bind(tcpServer)) {
         qWarning() << "Server ERROR: Failed to HTTP server to socket" << port;
         return;
     }
-#endif
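With the project now requiring Qt 6.8, the `QT_VERSION_CHECK` gates in this function are always true at build time, which is why the commit can delete them along with their `#else` branches. The macro itself just packs a version triple into one integer so version gates reduce to a single comparison. The sketch below reproduces its standard definition from Qt's `<QtGlobal>`; the `atLeastQt68` helper is not from the diff, only added here for illustration:

```cpp
#include <cstdint>

// Mirrors the QT_VERSION_CHECK macro as defined in Qt's <QtGlobal>:
// (major, minor, patch) packed into one integer, one byte per component.
#define QT_VERSION_CHECK(major, minor, patch) ((major << 16) | (minor << 8) | (patch))

// The gate the diff deletes: once the minimum supported Qt is 6.8, this is
// always true at build time, so the #if/#else branches were dead code.
constexpr bool atLeastQt68(std::uint32_t version)
{
    return version >= QT_VERSION_CHECK(6, 8, 0);
}
```

Because the packing is byte-per-component, Qt 6.8.0 becomes `0x060800`, and any 6.7.x build fails the gate.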
     m_server->route("/v1/models", QHttpServerRequest::Method::Get,
         [](const QHttpServerRequest &) {
@@ -607,19 +608,12 @@ void Server::start()
         }
     );

-#if QT_VERSION >= QT_VERSION_CHECK(6, 8, 0)
     m_server->addAfterRequestHandler(this, [](const QHttpServerRequest &req, QHttpServerResponse &resp) {
         Q_UNUSED(req);
         auto headers = resp.headers();
         headers.append("Access-Control-Allow-Origin"_L1, "*"_L1);
         resp.setHeaders(std::move(headers));
     });
-#else
-    m_server->afterRequest([](QHttpServerResponse &&resp) {
-        resp.addHeader("Access-Control-Allow-Origin", "*");
-        return std::move(resp);
-    });
-#endif

     connect(this, &Server::requestResetResponseState, m_chat, &Chat::resetResponseState, Qt::BlockingQueuedConnection);
 }
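The Qt 6.8 `addAfterRequestHandler` path kept by the diff treats the header set as a value object: copy the response's headers out, append the CORS header, and move the set back (the removed pre-6.8 `afterRequest` overload mutated the response directly instead). Below is a minimal non-Qt mock of that copy-append-set flow; the `Headers` and `Response` types are stand-ins, not the real Qt API:

```cpp
#include <string>
#include <utility>
#include <vector>

// Stand-in for QHttpHeaders: an appendable list of name/value pairs.
struct Headers {
    std::vector<std::pair<std::string, std::string>> entries;
    void append(std::string name, std::string value) {
        entries.emplace_back(std::move(name), std::move(value));
    }
};

// Stand-in for QHttpServerResponse: headers() returns a copy of the header
// set, and setHeaders() replaces it wholesale, as in the Qt 6.8 flow.
struct Response {
    Headers m_headers;
    Headers headers() const { return m_headers; }
    void setHeaders(Headers h) { m_headers = std::move(h); }
};

// Mirrors the body of the lambda registered with addAfterRequestHandler:
// copy the headers out, append the CORS header, move the set back in.
void addCorsHeader(Response &resp)
{
    auto headers = resp.headers();
    headers.append("Access-Control-Allow-Origin", "*");
    resp.setHeaders(std::move(headers));
}
```

The copy-out/set-back dance looks redundant but matches an API where the response exposes its headers only as a value, not by reference.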


@@ -8,7 +8,7 @@
 #include <QHttpServerResponse>
 #include <QJsonObject>
 #include <QList>
-#include <QObject>
+#include <QObject> // IWYU pragma: keep
 #include <QString>
 #include <memory>


@@ -1,5 +1,8 @@
 #include "tool.h"

+#include <QDataStream>
+#include <QtTypes>
+
 #include <string>

 using json = nlohmann::ordered_json;


@@ -9,6 +9,8 @@
 #include <QVariant>
 #include <QtGlobal>

+class QDataStream;
+
 using json = nlohmann::ordered_json;


@@ -2,8 +2,10 @@
 #include "tool.h"

+#include <QChar>
 #include <QSet>
-#include <QtGlobal>
+#include <QtAssert>
+#include <QtTypes>
 #include <stdexcept>


@@ -4,7 +4,7 @@
 #include <QByteArray>
 #include <QList>
 #include <QString>
-#include <QStringList>
+#include <QStringList> // IWYU pragma: keep

 namespace ToolEnums { enum class ParseState; }


@@ -6,6 +6,7 @@
 #include <QEvent>
 #include <QGlobalStatic>

 class MyToolModel: public ToolModel { };

 Q_GLOBAL_STATIC(MyToolModel, toolModelInstance)
 ToolModel *ToolModel::globalInstance()


@@ -9,7 +9,8 @@
 #include <QList>
 #include <QString>
 #include <QVariant>
-#include <QtGlobal>
+#include <QtPreprocessorSupport>

 class ToolModel : public QAbstractListModel
 {


@@ -5,7 +5,7 @@
 #include <QByteArray>
 #include <QJsonValue>
-#include <QLatin1StringView>
+#include <QLatin1StringView> // IWYU pragma: keep
 #include <QString>
 #include <QStringView>
 #include <QUtf8StringView>
@@ -13,8 +13,9 @@
 #include <initializer_list>
 #include <string_view>
-#include <utility>
+#include <utility> // IWYU pragma: keep

+// IWYU pragma: no_forward_declare QJsonValue
 class QJsonObject;
@@ -40,4 +41,4 @@ MAKE_FORMATTER(QVariant, value.toString().toUtf8());

 // alternative to QJsonObject's initializer_list constructor that accepts Latin-1 strings
 QJsonObject makeJsonObject(std::initializer_list<std::pair<QLatin1StringView, QJsonValue>> args);

-#include "utils.inl"
+#include "utils.inl" // IWYU pragma: export


@@ -1,5 +1,6 @@
 #include <QJsonObject>

 inline QJsonObject makeJsonObject(std::initializer_list<std::pair<QLatin1StringView, QJsonValue>> args)
 {
     QJsonObject obj;


@@ -7,15 +7,16 @@
 #include <xlsxformat.h>
 #include <xlsxworksheet.h>

+#include <QChar>
 #include <QDateTime>
 #include <QDebug>
+#include <QLatin1StringView>
 #include <QList>
 #include <QRegularExpression>
 #include <QString>
-#include <QStringList>
+#include <QStringList> // IWYU pragma: keep
 #include <QStringView>
 #include <QVariant>
-#include <QtGlobal>
 #include <QtLogging>
 #include <memory>


@@ -4,6 +4,7 @@
 class QIODevice;
 class QString;

 class XLSXToMD
 {
 public:

7 file diffs suppressed because they are too large