LocalAI running issue

For https://github.com/go-skynet/LocalAI/issues/771

unlisted, 1 file, 2023-08-11 15:41:12 UTC, expires 2025-08-11 15:41:09 UTC

localai-api-1  | go mod edit -replace github.com/go-skynet/go-llama.cpp=/build/go-llama
localai-api-1  | go mod edit -replace github.com/nomic-ai/gpt4all/gpt4all-bindings/golang=/build/gpt4all/gpt4all-bindings/golang
localai-api-1  | go mod edit -replace github.com/go-skynet/go-ggml-transformers.cpp=/build/go-ggml-transformers
localai-api-1  | go mod edit -replace github.com/donomii/go-rwkv.cpp=/build/go-rwkv
localai-api-1  | go mod edit -replace github.com/ggerganov/whisper.cpp=/build/whisper.cpp
localai-api-1  | go mod edit -replace github.com/go-skynet/go-bert.cpp=/build/go-bert
localai-api-1  | go mod edit -replace github.com/go-skynet/bloomz.cpp=/build/bloomz
localai-api-1  | go mod edit -replace github.com/mudler/go-stable-diffusion=/build/go-stable-diffusion
localai-api-1  | go mod edit -replace github.com/mudler/go-piper=/build/go-piper
localai-api-1  | go mod edit -replace github.com/mudler/go-ggllm.cpp=/build/go-ggllm
localai-api-1  | go mod download
localai-api-1  | touch prepare
localai-api-1  | mkdir -p backend-assets/grpc
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/langchain-huggingface ./cmd/grpc/langchain-huggingface/
localai-api-1  | make -C go-ggml-transformers BUILD_TYPE= libtransformers.a
localai-api-1  | make[1]: Entering directory '/build/go-ggml-transformers'
localai-api-1  | I libtransformers build info: 
localai-api-1  | I UNAME_S:  Linux
localai-api-1  | I UNAME_P:  unknown
localai-api-1  | I UNAME_M:  x86_64
localai-api-1  | I CFLAGS:   -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -I -O3 -DNDEBUG -std=c11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -Wno-unused-function -pthread -march=native -mtune=native
localai-api-1  | I CXXFLAGS: -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native
localai-api-1  | I LDFLAGS:  
localai-api-1  | I CMAKE_ARGS:  
localai-api-1  | I CC:       cc (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | I CXX:      g++ (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | 
localai-api-1  | g++ -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native starcoder.cpp -o starcoder.o -c 
localai-api-1  | In file included from starcoder.cpp:19:
localai-api-1  | ggml.cpp/examples/starcoder/main.cpp: In function 'int main_starcoder(int, char**)':
localai-api-1  | ggml.cpp/examples/starcoder/main.cpp:799:23: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   799 |     for (int i = 0; i < embd_inp.size(); i++) {
localai-api-1  |       |                     ~~^~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/starcoder/main.cpp:821:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   821 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/starcoder/main.cpp:837:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   837 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/starcoder/main.cpp:859:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   859 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/starcoder/main.cpp:861:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   861 |                 if (embd.size() >= params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~~
localai-api-1  | starcoder.cpp: In function 'int starcoder_predict(void*, void*, char*)':
localai-api-1  | starcoder.cpp:80:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    80 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | starcoder.cpp:96:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    96 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | starcoder.cpp:118:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   118 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | starcoder.cpp:120:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   120 |                 if (embd.size() >= params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~~
localai-api-1  | starcoder.cpp:36:19: warning: unused variable 't_main_start_us' [-Wunused-variable]
localai-api-1  |    36 |     const int64_t t_main_start_us = ggml_time_us();
localai-api-1  |       |                   ^~~~~~~~~~~~~~~
localai-api-1  | starcoder.cpp:47:13: warning: unused variable 't_load_us' [-Wunused-variable]
localai-api-1  |    47 |     int64_t t_load_us = 0;
localai-api-1  |       |             ^~~~~~~~~
localai-api-1  | g++ -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native falcon.cpp -o falcon.o -c 
localai-api-1  | In file included from falcon.cpp:19:
localai-api-1  | implementations/falcon.cpp: In function 'bool falcon_model_load(const string&, falcon_model&, gpt_vocab&)':
localai-api-1  | implementations/falcon.cpp:187:13: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   187 |             .mem_size   = ctx_size,
localai-api-1  |       |             ^
localai-api-1  | implementations/falcon.cpp:188:13: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   188 |             .mem_buffer = NULL,
localai-api-1  |       |             ^
localai-api-1  | implementations/falcon.cpp:189:13: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   189 |             .no_alloc   = false,
localai-api-1  |       |             ^
localai-api-1  | In file included from falcon.cpp:19:
localai-api-1  | implementations/falcon.cpp: In function 'bool falcon_eval(const falcon_model&, int, int, const std::vector<int>&, std::vector<float>&, size_t&)':
localai-api-1  | implementations/falcon.cpp:410:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   410 |         .mem_size   = buf_size,
localai-api-1  |       |         ^
localai-api-1  | implementations/falcon.cpp:411:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   411 |         .mem_buffer = buf,
localai-api-1  |       |         ^
localai-api-1  | implementations/falcon.cpp:412:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   412 |         .no_alloc   = false,
localai-api-1  |       |         ^
localai-api-1  | falcon.cpp: In function 'int falcon_predict(void*, void*, char*)':
localai-api-1  | falcon.cpp:69:34: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    69 |      for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                                ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | falcon.cpp:85:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    85 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | falcon.cpp:107:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   107 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | falcon.cpp:109:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   109 |                 if (embd.size() > params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~
localai-api-1  | falcon.cpp:36:19: warning: unused variable 't_main_start_us' [-Wunused-variable]
localai-api-1  |    36 |     const int64_t t_main_start_us = ggml_time_us();
localai-api-1  |       |                   ^~~~~~~~~~~~~~~
localai-api-1  | falcon.cpp:48:13: warning: unused variable 't_load_us' [-Wunused-variable]
localai-api-1  |    48 |     int64_t t_load_us = 0;
localai-api-1  |       |             ^~~~~~~~~
localai-api-1  | g++ -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native gptj.cpp -o gptj.o -c 
localai-api-1  | In file included from gptj.cpp:19:
localai-api-1  | ggml.cpp/examples/gpt-j/main.cpp: In function 'int main_gptj(int, char**)':
localai-api-1  | ggml.cpp/examples/gpt-j/main.cpp:674:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   674 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/gpt-j/main.cpp:690:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   690 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/gpt-j/main.cpp:712:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   712 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/gpt-j/main.cpp:714:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   714 |                 if (embd.size() > params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~
localai-api-1  | gptj.cpp: In function 'int gptj_predict(void*, void*, char*)':
localai-api-1  | gptj.cpp:72:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    72 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | gptj.cpp:88:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    88 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | gptj.cpp:110:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   110 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | gptj.cpp:112:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   112 |                 if (embd.size() > params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~
localai-api-1  | gptj.cpp:36:19: warning: unused variable 't_main_start_us' [-Wunused-variable]
localai-api-1  |    36 |     const int64_t t_main_start_us = ggml_time_us();
localai-api-1  |       |                   ^~~~~~~~~~~~~~~
localai-api-1  | gptj.cpp:48:13: warning: unused variable 't_load_us' [-Wunused-variable]
localai-api-1  |    48 |     int64_t t_load_us = 0;
localai-api-1  |       |             ^~~~~~~~~
localai-api-1  | g++ -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native mpt.cpp -o mpt.o -c 
localai-api-1  | In file included from mpt.cpp:19:
localai-api-1  | ggml.cpp/examples/mpt/main.cpp: In function 'bool mpt_model_load(const string&, mpt_model&, gpt_vocab&)':
localai-api-1  | ggml.cpp/examples/mpt/main.cpp:246:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::__cxx11::basic_string<wchar_t>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   246 |             for (int w = 0; w < word_multibytes.size(); w++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | mpt.cpp: In function 'int mpt_predict(void*, void*, char*)':
localai-api-1  | mpt.cpp:37:19: warning: unused variable 't_main_start_us' [-Wunused-variable]
localai-api-1  |    37 |     const int64_t t_main_start_us = ggml_time_us();
localai-api-1  |       |                   ^~~~~~~~~~~~~~~
localai-api-1  | mpt.cpp:49:13: warning: unused variable 't_load_us' [-Wunused-variable]
localai-api-1  |    49 |     int64_t t_load_us = 0;
localai-api-1  |       |             ^~~~~~~~~
localai-api-1  | mkdir build
localai-api-1  | cd build && cmake ../ggml.cpp  && make VERBOSE=1 ggml && cp -rf src/CMakeFiles/ggml.dir/ggml.c.o ../ggml.o
localai-api-1  | CMake Deprecation Warning at CMakeLists.txt:1 (cmake_minimum_required):
localai-api-1  |   Compatibility with CMake < 3.5 will be removed from a future version of
localai-api-1  |   CMake.
localai-api-1  | 
localai-api-1  |   Update the VERSION argument <min> value or use a ...<max> suffix to tell
localai-api-1  |   CMake that the project does not need compatibility with older versions.
localai-api-1  | 
localai-api-1  | 
localai-api-1  | -- The C compiler identification is GNU 10.2.1
localai-api-1  | -- The CXX compiler identification is GNU 10.2.1
localai-api-1  | -- Detecting C compiler ABI info
localai-api-1  | -- Detecting C compiler ABI info - done
localai-api-1  | -- Check for working C compiler: /usr/bin/cc - skipped
localai-api-1  | -- Detecting C compile features
localai-api-1  | -- Detecting C compile features - done
localai-api-1  | -- Detecting CXX compiler ABI info
localai-api-1  | -- Detecting CXX compiler ABI info - done
localai-api-1  | -- Check for working CXX compiler: /usr/bin/c++ - skipped
localai-api-1  | -- Detecting CXX compile features
localai-api-1  | -- Detecting CXX compile features - done
localai-api-1  | -- Found Git: /usr/bin/git (found version "2.30.2") 
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
localai-api-1  | -- Looking for pthread_create in pthreads
localai-api-1  | -- Looking for pthread_create in pthreads - not found
localai-api-1  | -- Looking for pthread_create in pthread
localai-api-1  | -- Looking for pthread_create in pthread - found
localai-api-1  | -- Found Threads: TRUE  
localai-api-1  | -- CMAKE_SYSTEM_PROCESSOR: x86_64
localai-api-1  | -- x86 detected
localai-api-1  | -- Linux detected
localai-api-1  | -- x86 detected
localai-api-1  | -- Linux detected
localai-api-1  | -- Configuring done (1.2s)
localai-api-1  | -- Generating done (0.1s)
localai-api-1  | -- Build files have been written to: /build/go-ggml-transformers/build
localai-api-1  | make[2]: Entering directory '/build/go-ggml-transformers/build'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -S/build/go-ggml-transformers/ggml.cpp -B/build/go-ggml-transformers/build --check-build-system CMakeFiles/Makefile.cmake 0
localai-api-1  | make  -f CMakeFiles/Makefile2 ggml
localai-api-1  | make[3]: Entering directory '/build/go-ggml-transformers/build'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -S/build/go-ggml-transformers/ggml.cpp -B/build/go-ggml-transformers/build --check-build-system CMakeFiles/Makefile.cmake 0
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_progress_start /build/go-ggml-transformers/build/CMakeFiles 2
localai-api-1  | make  -f CMakeFiles/Makefile2 src/CMakeFiles/ggml.dir/all
localai-api-1  | make[4]: Entering directory '/build/go-ggml-transformers/build'
localai-api-1  | make  -f src/CMakeFiles/ggml.dir/build.make src/CMakeFiles/ggml.dir/depend
localai-api-1  | make[5]: Entering directory '/build/go-ggml-transformers/build'
localai-api-1  | cd /build/go-ggml-transformers/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggml-transformers/ggml.cpp /build/go-ggml-transformers/ggml.cpp/src /build/go-ggml-transformers/build /build/go-ggml-transformers/build/src /build/go-ggml-transformers/build/src/CMakeFiles/ggml.dir/DependInfo.cmake "--color="
localai-api-1  | make[5]: Leaving directory '/build/go-ggml-transformers/build'
localai-api-1  | make  -f src/CMakeFiles/ggml.dir/build.make src/CMakeFiles/ggml.dir/build
localai-api-1  | make[5]: Entering directory '/build/go-ggml-transformers/build'
localai-api-1  | [ 50%] Building C object src/CMakeFiles/ggml.dir/ggml.c.o
localai-api-1  | cd /build/go-ggml-transformers/build/src && /usr/bin/cc  -I/build/go-ggml-transformers/ggml.cpp/src/. -I/build/go-ggml-transformers/ggml.cpp/src/../include -I/build/go-ggml-transformers/ggml.cpp/src/../include/ggml -Wall                                       -Wextra                                     -Wpedantic                                  -Wshadow                                    -Wcast-qual                                 -Wstrict-prototypes                         -Wpointer-arith                             -Wdouble-promotion                          -Wno-unused-function                     -Werror=vla -msse3 -O3 -DNDEBUG -std=gnu11 -MD -MT src/CMakeFiles/ggml.dir/ggml.c.o -MF CMakeFiles/ggml.dir/ggml.c.o.d -o CMakeFiles/ggml.dir/ggml.c.o -c /build/go-ggml-transformers/ggml.cpp/src/ggml.c
localai-api-1  | /build/go-ggml-transformers/ggml.cpp/src/ggml.c: In function 'quantize_row_q8_0':
localai-api-1  | /build/go-ggml-transformers/ggml.cpp/src/ggml.c:1125:15: warning: unused variable 'nb' [-Wunused-variable]
localai-api-1  |  1125 |     const int nb = k / QK8_0;
localai-api-1  |       |               ^~
localai-api-1  | /build/go-ggml-transformers/ggml.cpp/src/ggml.c: In function 'quantize_row_q8_1':
localai-api-1  | /build/go-ggml-transformers/ggml.cpp/src/ggml.c:1320:15: warning: unused variable 'nb' [-Wunused-variable]
localai-api-1  |  1320 |     const int nb = k / QK8_1;
localai-api-1  |       |               ^~
localai-api-1  | [100%] Linking C static library libggml.a
localai-api-1  | cd /build/go-ggml-transformers/build/src && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/ggml.dir/cmake_clean_target.cmake
localai-api-1  | cd /build/go-ggml-transformers/build/src && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/ggml.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libggml.a CMakeFiles/ggml.dir/ggml.c.o
localai-api-1  | /usr/bin/ranlib libggml.a
localai-api-1  | make[5]: Leaving directory '/build/go-ggml-transformers/build'
localai-api-1  | [100%] Built target ggml
localai-api-1  | make[4]: Leaving directory '/build/go-ggml-transformers/build'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_progress_start /build/go-ggml-transformers/build/CMakeFiles 0
localai-api-1  | make[3]: Leaving directory '/build/go-ggml-transformers/build'
localai-api-1  | make[2]: Leaving directory '/build/go-ggml-transformers/build'
localai-api-1  | g++ -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native gpt2.cpp ggml.o -o gpt2.o -c 
localai-api-1  | In file included from gpt2.cpp:16:
localai-api-1  | ggml.cpp/examples/gpt-2/main.cpp: In function 'int main_gpt2(int, char**)':
localai-api-1  | ggml.cpp/examples/gpt-2/main.cpp:770:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   770 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/gpt-2/main.cpp:786:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   786 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/gpt-2/main.cpp:808:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   808 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/gpt-2/main.cpp:810:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   810 |                 if (embd.size() >= params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~~
localai-api-1  | gpt2.cpp: In function 'int gpt2_predict(void*, void*, char*)':
localai-api-1  | gpt2.cpp:68:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    68 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | gpt2.cpp:84:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    84 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | gpt2.cpp:106:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   106 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | gpt2.cpp:108:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   108 |                 if (embd.size() >= params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~~
localai-api-1  | gpt2.cpp:34:19: warning: unused variable 't_main_start_us' [-Wunused-variable]
localai-api-1  |    34 |     const int64_t t_main_start_us = ggml_time_us();
localai-api-1  |       |                   ^~~~~~~~~~~~~~~
localai-api-1  | gpt2.cpp:43:13: warning: unused variable 't_load_us' [-Wunused-variable]
localai-api-1  |    43 |     int64_t t_load_us = 0;
localai-api-1  |       |             ^~~~~~~~~
localai-api-1  | g++: warning: ggml.o: linker input file unused because linking not done
localai-api-1  | g++ -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native replit.cpp -o replit.o -c 
localai-api-1  | In file included from replit.cpp:21:
localai-api-1  | ggml.cpp/examples/replit/main.cpp: In function 'std::pair<std::vector<long unsigned int>, float> encode_word(const string&, const piece_map_t&)':
localai-api-1  | ggml.cpp/examples/replit/main.cpp:54:39: warning: comparison of integer expressions of different signedness: 'int' and 'std::__cxx11::basic_string<char>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    54 |     for (int start_idx = 0; start_idx < word.length(); ++start_idx) {
localai-api-1  |       |                             ~~~~~~~~~~^~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/replit/main.cpp:56:51: warning: comparison of integer expressions of different signedness: 'int' and 'std::__cxx11::basic_string<char>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    56 |         for (int end_idx = start_idx + 1; end_idx <= word.length(); ++end_idx) {
localai-api-1  |       |                                           ~~~~~~~~^~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/replit/main.cpp: In function 'bool replit_tokenizer_load(replit_tokenizer&, std::istream&, int)':
localai-api-1  | ggml.cpp/examples/replit/main.cpp:94:31: warning: comparison of integer expressions of different signedness: 'std::size_t' {aka 'long unsigned int'} and 'int' [-Wsign-compare]
localai-api-1  |    94 |     for (std::size_t i = 0; i < max_vocab_size; i++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/replit/main.cpp: In function 'bool replit_model_load(const string&, replit_model&, replit_tokenizer&)':
localai-api-1  | ggml.cpp/examples/replit/main.cpp:345:56: warning: format '%lld' expects argument of type 'long long int', but argument 4 has type 'long int' [-Wformat=]
localai-api-1  |   345 |         printf("%s: memory_size = %8.2f MB, n_mem = %lld\n", __func__, memory_size / 1024.0 / 1024.0, n_mem);
localai-api-1  |       |                                                     ~~~^                                              ~~~~~
localai-api-1  |       |                                                        |                                              |
localai-api-1  |       |                                                        long long int                                  long int
localai-api-1  |       |                                                     %ld
localai-api-1  | ggml.cpp/examples/replit/main.cpp: In function 'int main_replit(int, char**)':
localai-api-1  | ggml.cpp/examples/replit/main.cpp:704:23: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<long unsigned int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   704 |     for (int i = 0; i < embd_inp.size(); i++) {
localai-api-1  |       |                     ~~^~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/replit/main.cpp:718:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<long unsigned int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   718 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/replit/main.cpp:734:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<long unsigned int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   734 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/replit/main.cpp:757:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<long unsigned int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   757 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/replit/main.cpp:759:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   759 |                 if (embd.size() > params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~
localai-api-1  | replit.cpp: In function 'int replit_predict(void*, void*, char*)':
localai-api-1  | replit.cpp:64:21: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<long unsigned int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    64 |   for (int i = 0; i < embd_inp.size(); i++) {
localai-api-1  |       |                   ~~^~~~~~~~~~~~~~~~~
localai-api-1  | replit.cpp:65:31: warning: format '%d' expects argument of type 'int', but argument 4 has type '__gnu_cxx::__alloc_traits<std::allocator<long unsigned int>, long unsigned int>::value_type' {aka 'long unsigned int'} [-Wformat=]
localai-api-1  |    65 |     printf("%s: token[%d] = %6d\n", __func__, i, embd_inp[i]);
localai-api-1  |       |                             ~~^
localai-api-1  |       |                               |
localai-api-1  |       |                               int
localai-api-1  |       |                             %6ld
localai-api-1  | replit.cpp:80:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<long unsigned int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    80 |   for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | replit.cpp:96:11: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<long unsigned int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    96 |     if (i >= embd_inp.size()) {
localai-api-1  |       |         ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | replit.cpp:120:25: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<long unsigned int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   120 |       for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                       ~~^~~~~~~~~~~~~~~~~
localai-api-1  | replit.cpp:122:25: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   122 |         if (embd.size() > params.n_batch) {
localai-api-1  |       |             ~~~~~~~~~~~~^~~~~~~~~~~~~~~~
localai-api-1  | replit.cpp:39:17: warning: unused variable 't_main_start_us' [-Wunused-variable]
localai-api-1  |    39 |   const int64_t t_main_start_us = ggml_time_us();
localai-api-1  |       |                 ^~~~~~~~~~~~~~~
localai-api-1  | replit.cpp:49:11: warning: unused variable 't_load_us' [-Wunused-variable]
localai-api-1  |    49 |   int64_t t_load_us = 0;
localai-api-1  |       |           ^~~~~~~~~
localai-api-1  | g++ -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native gptneox.cpp -o gptneox.o -c 
localai-api-1  | In file included from gptneox.cpp:19:
localai-api-1  | ggml.cpp/examples/gpt-neox/main.cpp: In function 'int main_gptneox(int, char**)':
localai-api-1  | ggml.cpp/examples/gpt-neox/main.cpp:728:23: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   728 |     for (int i = 0; i < embd_inp.size(); i++) {
localai-api-1  |       |                     ~~^~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/gpt-neox/main.cpp:739:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   739 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/gpt-neox/main.cpp:755:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   755 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/gpt-neox/main.cpp:777:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   777 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/gpt-neox/main.cpp:779:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   779 |                 if (embd.size() > params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~
localai-api-1  | gptneox.cpp: In function 'int gpt_neox_predict(void*, void*, char*)':
localai-api-1  | gptneox.cpp:71:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    71 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | gptneox.cpp:87:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    87 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | gptneox.cpp:109:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   109 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | gptneox.cpp:111:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   111 |                 if (embd.size() > params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~
localai-api-1  | gptneox.cpp:36:19: warning: unused variable 't_main_start_us' [-Wunused-variable]
localai-api-1  |    36 |     const int64_t t_main_start_us = ggml_time_us();
localai-api-1  |       |                   ^~~~~~~~~~~~~~~
localai-api-1  | gptneox.cpp:48:13: warning: unused variable 't_load_us' [-Wunused-variable]
localai-api-1  |    48 |     int64_t t_load_us = 0;
localai-api-1  |       |             ^~~~~~~~~
localai-api-1  | g++ -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native dolly.cpp -o dolly.o -c 
localai-api-1  | In file included from dolly.cpp:18:
localai-api-1  | ggml.cpp/examples/dolly-v2/main.cpp: In function 'int main_dolly(int, char**)':
localai-api-1  | ggml.cpp/examples/dolly-v2/main.cpp:731:23: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   731 |     for (int i = 0; i < embd_inp.size(); i++) {
localai-api-1  |       |                     ~~^~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/dolly-v2/main.cpp:744:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   744 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/dolly-v2/main.cpp:760:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   760 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/dolly-v2/main.cpp:783:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   783 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | ggml.cpp/examples/dolly-v2/main.cpp:785:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   785 |                 if (embd.size() > params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~
localai-api-1  | dolly.cpp: In function 'int dolly_predict(void*, void*, char*)':
localai-api-1  | dolly.cpp:70:33: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    70 |     for (int i = embd.size(); i < embd_inp.size() + params.n_predict; i++) {
localai-api-1  |       |                               ~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | dolly.cpp:86:15: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    86 |         if (i >= embd_inp.size()) {
localai-api-1  |       |             ~~^~~~~~~~~~~~~~~~~~
localai-api-1  | dolly.cpp:109:31: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<int>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |   109 |             for (int k = i; k < embd_inp.size(); k++) {
localai-api-1  |       |                             ~~^~~~~~~~~~~~~~~~~
localai-api-1  | dolly.cpp:111:33: warning: comparison of integer expressions of different signedness: 'std::vector<int>::size_type' {aka 'long unsigned int'} and 'int32_t' {aka 'int'} [-Wsign-compare]
localai-api-1  |   111 |                 if (embd.size() > params.n_batch) {
localai-api-1  |       |                     ~~~~~~~~~~~~^~~~~~~~~~~~~~~~
localai-api-1  | dolly.cpp:46:13: warning: unused variable 't_load_us' [-Wunused-variable]
localai-api-1  |    46 |     int64_t t_load_us = 0;
localai-api-1  |       |             ^~~~~~~~~
localai-api-1  | g++ -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native -c ggml.cpp/examples/common-ggml.cpp -o common-ggml.o
localai-api-1  | g++ -I. -I./ggml.cpp/include -I./ggml.cpp/include/ggml/ -I./ggml.cpp/examples/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native -c ggml.cpp/examples/common.cpp -o common.o
localai-api-1  | ar src libtransformers.a replit.o gptj.o mpt.o gptneox.o starcoder.o gpt2.o dolly.o  falcon.o  ggml.o common-ggml.o common.o 
localai-api-1  | make[1]: Leaving directory '/build/go-ggml-transformers'
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-ggml-transformers LIBRARY_PATH=/build/go-ggml-transformers \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/falcon-ggml ./cmd/grpc/falcon-ggml/
localai-api-1  | make -C go-bert libgobert.a
localai-api-1  | make[1]: Entering directory '/build/go-bert'
localai-api-1  | I go-gpt4all-j build info: 
localai-api-1  | I UNAME_S:  Linux
localai-api-1  | I UNAME_P:  unknown
localai-api-1  | I UNAME_M:  x86_64
localai-api-1  | I CFLAGS:   -I. -I./bert.cpp/ggml/include/ggml/ -I./bert.cpp/ -I -O3 -DNDEBUG -std=c11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -Wno-unused-function -pthread -march=native -mtune=native
localai-api-1  | I CXXFLAGS: -I. -I./bert.cpp/ggml/include/ggml/ -I./bert.cpp/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native
localai-api-1  | I LDFLAGS:  
localai-api-1  | I CMAKEFLAGS:  
localai-api-1  | I CC:       cc (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | I CXX:      g++ (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | 
localai-api-1  | cd bert.cpp && mkdir build
localai-api-1  | sed "s/#include <regex>/#include <regex>\n#include <unordered_map>/" bert.cpp/bert.cpp > bert.cpp/bert.tmp && mv bert.cpp/bert.tmp bert.cpp/bert.cpp
localai-api-1  | cd bert.cpp/build && cmake .. -DBUILD_SHARED_LIBS=OFF -DCMAKE_BUILD_TYPE=Release && make
localai-api-1  | -- The C compiler identification is GNU 10.2.1
localai-api-1  | -- The CXX compiler identification is GNU 10.2.1
localai-api-1  | -- Detecting C compiler ABI info
localai-api-1  | -- Detecting C compiler ABI info - done
localai-api-1  | -- Check for working C compiler: /usr/bin/cc - skipped
localai-api-1  | -- Detecting C compile features
localai-api-1  | -- Detecting C compile features - done
localai-api-1  | -- Detecting CXX compiler ABI info
localai-api-1  | -- Detecting CXX compiler ABI info - done
localai-api-1  | -- Check for working CXX compiler: /usr/bin/c++ - skipped
localai-api-1  | -- Detecting CXX compile features
localai-api-1  | -- Detecting CXX compile features - done
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
localai-api-1  | -- Check if compiler accepts -pthread
localai-api-1  | -- Check if compiler accepts -pthread - yes
localai-api-1  | -- Found Threads: TRUE  
localai-api-1  | -- CMAKE_SYSTEM_PROCESSOR: x86_64
localai-api-1  | -- x86 detected
localai-api-1  | CMake Deprecation Warning at ggml/CMakeLists.txt:1 (cmake_minimum_required):
localai-api-1  |   Compatibility with CMake < 3.5 will be removed from a future version of
localai-api-1  |   CMake.
localai-api-1  | 
localai-api-1  |   Update the VERSION argument <min> value or use a ...<max> suffix to tell
localai-api-1  |   CMake that the project does not need compatibility with older versions.
localai-api-1  | 
localai-api-1  | 
localai-api-1  | -- CMAKE_SYSTEM_PROCESSOR: x86_64
localai-api-1  | -- x86 detected
localai-api-1  | -- Linux detected
localai-api-1  | -- Configuring done (0.8s)
localai-api-1  | -- Generating done (0.0s)
localai-api-1  | -- Build files have been written to: /build/go-bert/bert.cpp/build
localai-api-1  | make[2]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[3]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [  8%] Building C object ggml/src/CMakeFiles/ggml.dir/ggml.c.o
localai-api-1  | [ 16%] Linking C static library libggml.a
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [ 16%] Built target ggml
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [ 25%] Building CXX object CMakeFiles/bert.dir/bert.cpp.o
localai-api-1  | [ 33%] Linking CXX static library libbert.a
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [ 33%] Built target bert
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [ 41%] Building CXX object examples/CMakeFiles/server.dir/server.cpp.o
localai-api-1  | [ 50%] Linking CXX executable ../bin/server
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [ 50%] Built target server
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [ 58%] Building CXX object examples/CMakeFiles/main.dir/main.cpp.o
localai-api-1  | [ 66%] Linking CXX executable ../bin/main
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [ 66%] Built target main
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [ 75%] Building CXX object examples/CMakeFiles/test_tokenizer.dir/test_tokenizer.cpp.o
localai-api-1  | [ 83%] Linking CXX executable ../bin/test_tokenizer
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [ 83%] Built target test_tokenizer
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [ 91%] Building CXX object models/CMakeFiles/quantize.dir/quantize.cpp.o
localai-api-1  | [100%] Linking CXX executable ../bin/quantize
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [100%] Built target quantize
localai-api-1  | make[3]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[2]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | cp bert.cpp/build/CMakeFiles/bert.dir/bert.cpp.o bert.o
localai-api-1  | g++ -I. -I./bert.cpp/ggml/include/ggml/ -I./bert.cpp/ -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native gobert.cpp -o gobert.o -c 
localai-api-1  | In file included from gobert.cpp:6:
localai-api-1  | ./bert.cpp/bert.cpp: In function 'bert_ctx* bert_load_from_file(const char*)':
localai-api-1  | ./bert.cpp/bert.cpp:470:13: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   470 |             .mem_size = model_mem_req,
localai-api-1  |       |             ^
localai-api-1  | ./bert.cpp/bert.cpp:471:13: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   471 |             .mem_buffer = NULL,
localai-api-1  |       |             ^
localai-api-1  | ./bert.cpp/bert.cpp:472:13: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   472 |             .no_alloc = false,
localai-api-1  |       |             ^
localai-api-1  | ./bert.cpp/bert.cpp:610:89: warning: format '%lld' expects argument of type 'long long int', but argument 5 has type 'int64_t' {aka 'long int'} [-Wformat=]
localai-api-1  |   610 |                 fprintf(stderr, "%s: tensor '%s' has wrong shape in model file: got [%lld, %lld], expected [%lld, %lld]\n",
localai-api-1  |       |                                                                                      ~~~^
localai-api-1  |       |                                                                                         |
localai-api-1  |       |                                                                                         long long int
localai-api-1  |       |                                                                                      %ld
localai-api-1  |   611 |                         __func__, name.data(), tensor->ne[0], tensor->ne[1], ne[0], ne[1]);
localai-api-1  |       |                                                ~~~~~~~~~~~~~                             
localai-api-1  |       |                                                            |
localai-api-1  |       |                                                            int64_t {aka long int}
localai-api-1  | ./bert.cpp/bert.cpp:610:95: warning: format '%lld' expects argument of type 'long long int', but argument 6 has type 'int64_t' {aka 'long int'} [-Wformat=]
localai-api-1  |   610 |                 fprintf(stderr, "%s: tensor '%s' has wrong shape in model file: got [%lld, %lld], expected [%lld, %lld]\n",
localai-api-1  |       |                                                                                            ~~~^
localai-api-1  |       |                                                                                               |
localai-api-1  |       |                                                                                               long long int
localai-api-1  |       |                                                                                            %ld
localai-api-1  |   611 |                         __func__, name.data(), tensor->ne[0], tensor->ne[1], ne[0], ne[1]);
localai-api-1  |       |                                                               ~~~~~~~~~~~~~                    
localai-api-1  |       |                                                                           |
localai-api-1  |       |                                                                           int64_t {aka long int}
localai-api-1  | ./bert.cpp/bert.cpp:610:112: warning: format '%lld' expects argument of type 'long long int', but argument 7 has type 'int64_t' {aka 'long int'} [-Wformat=]
localai-api-1  |   610 |                 fprintf(stderr, "%s: tensor '%s' has wrong shape in model file: got [%lld, %lld], expected [%lld, %lld]\n",
localai-api-1  |       |                                                                                                             ~~~^
localai-api-1  |       |                                                                                                                |
localai-api-1  |       |                                                                                                                long long int
localai-api-1  |       |                                                                                                             %ld
localai-api-1  |   611 |                         __func__, name.data(), tensor->ne[0], tensor->ne[1], ne[0], ne[1]);
localai-api-1  |       |                                                                              ~~~~~                              
localai-api-1  |       |                                                                                  |
localai-api-1  |       |                                                                                  int64_t {aka long int}
localai-api-1  | ./bert.cpp/bert.cpp:610:118: warning: format '%lld' expects argument of type 'long long int', but argument 8 has type 'int64_t' {aka 'long int'} [-Wformat=]
localai-api-1  |   610 |                 fprintf(stderr, "%s: tensor '%s' has wrong shape in model file: got [%lld, %lld], expected [%lld, %lld]\n",
localai-api-1  |       |                                                                                                                   ~~~^
localai-api-1  |       |                                                                                                                      |
localai-api-1  |       |                                                                                                                      long long int
localai-api-1  |       |                                                                                                                   %ld
localai-api-1  |   611 |                         __func__, name.data(), tensor->ne[0], tensor->ne[1], ne[0], ne[1]);
localai-api-1  |       |                                                                                     ~~~~~                             
localai-api-1  |       |                                                                                         |
localai-api-1  |       |                                                                                         int64_t {aka long int}
localai-api-1  | ./bert.cpp/bert.cpp:624:37: warning: format '%lld' expects argument of type 'long long int', but argument 3 has type 'int64_t' {aka 'long int'} [-Wformat=]
localai-api-1  |   624 |                 printf("%24s - [%5lld, %5lld], type = %6s, %6.2f MB, %9zu bytes\n", name.data(), ne[0], ne[1], ftype_str[ftype], ggml_nbytes(tensor) / 1024.0 / 1024.0, ggml_nbytes(tensor));
localai-api-1  |       |                                 ~~~~^                                                            ~~~~~
localai-api-1  |       |                                     |                                                                |
localai-api-1  |       |                                     long long int                                                    int64_t {aka long int}
localai-api-1  |       |                                 %5ld
localai-api-1  | ./bert.cpp/bert.cpp:624:44: warning: format '%lld' expects argument of type 'long long int', but argument 4 has type 'int64_t' {aka 'long int'} [-Wformat=]
localai-api-1  |   624 |                 printf("%24s - [%5lld, %5lld], type = %6s, %6.2f MB, %9zu bytes\n", name.data(), ne[0], ne[1], ftype_str[ftype], ggml_nbytes(tensor) / 1024.0 / 1024.0, ggml_nbytes(tensor));
localai-api-1  |       |                                        ~~~~^                                                            ~~~~~
localai-api-1  |       |                                            |                                                                |
localai-api-1  |       |                                            long long int                                                    int64_t {aka long int}
localai-api-1  |       |                                        %5ld
localai-api-1  | ./bert.cpp/bert.cpp:655:101: warning: format '%llu' expects argument of type 'long long unsigned int', but argument 6 has type 'long unsigned int' [-Wformat=]
localai-api-1  |   655 |                 fprintf(stderr, "%s: tensor '%s' has wrong size in model file: got %zu, expected %llu\n",
localai-api-1  |       |                                                                                                  ~~~^
localai-api-1  |       |                                                                                                     |
localai-api-1  |       |                                                                                                     long long unsigned int
localai-api-1  |       |                                                                                                  %lu
localai-api-1  |   656 |                         __func__, name.data(), ggml_nbytes(tensor), nelements * bpe);
localai-api-1  |       |                                                                     ~~~~~~~~~~~~~~~                  
localai-api-1  |       |                                                                               |
localai-api-1  |       |                                                                               long unsigned int
localai-api-1  | ./bert.cpp/bert.cpp:692:56: warning: format '%lld' expects argument of type 'long long int', but argument 4 has type 'int64_t' {aka 'long int'} [-Wformat=]
localai-api-1  |   692 |     printf("%s: mem_per_token %zd KB, mem_per_input %lld MB\n", __func__, new_bert->mem_per_token / (1 << 10), new_bert->mem_per_input / (1 << 20));
localai-api-1  |       |                                                     ~~~^                                                       ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  |       |                                                        |                                                                               |
localai-api-1  |       |                                                        long long int                                                                   int64_t {aka long int}
localai-api-1  |       |                                                     %ld
localai-api-1  | ./bert.cpp/bert.cpp: In function 'void bert_eval_batch(bert_ctx*, int32_t, int32_t, bert_vocab_id**, int32_t*, float**)':
localai-api-1  | ./bert.cpp/bert.cpp:776:13: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   776 |             .mem_size = buf_compute.size,
localai-api-1  |       |             ^
localai-api-1  | ./bert.cpp/bert.cpp:777:13: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   777 |             .mem_buffer = buf_compute.data,
localai-api-1  |       |             ^
localai-api-1  | ./bert.cpp/bert.cpp:778:13: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |   778 |             .no_alloc = false,
localai-api-1  |       |             ^
localai-api-1  | gobert.cpp: In function 'int bert_token_embeddings(void*, void*, int*, int, float*)':
localai-api-1  | gobert.cpp:32:23: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<float>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    32 |     for (int i = 0; i < embeddings.size(); i++) {
localai-api-1  |       |                     ~~^~~~~~~~~~~~~~~~~~~
localai-api-1  | gobert.cpp:19:19: warning: unused variable 't_main_start_us' [-Wunused-variable]
localai-api-1  |    19 |     const int64_t t_main_start_us = ggml_time_us();
localai-api-1  |       |                   ^~~~~~~~~~~~~~~
localai-api-1  | gobert.cpp:24:9: warning: unused variable 'N' [-Wunused-variable]
localai-api-1  |    24 |     int N = bert_n_max_tokens(bctx);
localai-api-1  |       |         ^
localai-api-1  | gobert.cpp: In function 'int bert_embeddings(void*, void*, float*)':
localai-api-1  | gobert.cpp:53:23: warning: comparison of integer expressions of different signedness: 'int' and 'std::vector<float>::size_type' {aka 'long unsigned int'} [-Wsign-compare]
localai-api-1  |    53 |     for (int i = 0; i < embeddings.size(); i++) {
localai-api-1  |       |                     ~~^~~~~~~~~~~~~~~~~~~
localai-api-1  | gobert.cpp:39:19: warning: unused variable 't_main_start_us' [-Wunused-variable]
localai-api-1  |    39 |     const int64_t t_main_start_us = ggml_time_us();
localai-api-1  |       |                   ^~~~~~~~~~~~~~~
localai-api-1  | cd bert.cpp/build && make VERBOSE=1 ggml && cp -rf ggml/src/CMakeFiles/ggml.dir/ggml.c.o ../../ggml.o
localai-api-1  | make[2]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -S/build/go-bert/bert.cpp -B/build/go-bert/bert.cpp/build --check-build-system CMakeFiles/Makefile.cmake 0
localai-api-1  | make  -f CMakeFiles/Makefile2 ggml
localai-api-1  | make[3]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -S/build/go-bert/bert.cpp -B/build/go-bert/bert.cpp/build --check-build-system CMakeFiles/Makefile.cmake 0
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_progress_start /build/go-bert/bert.cpp/build/CMakeFiles 2
localai-api-1  | make  -f CMakeFiles/Makefile2 ggml/src/CMakeFiles/ggml.dir/all
localai-api-1  | make[4]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make  -f ggml/src/CMakeFiles/ggml.dir/build.make ggml/src/CMakeFiles/ggml.dir/depend
localai-api-1  | make[5]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | cd /build/go-bert/bert.cpp/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-bert/bert.cpp /build/go-bert/bert.cpp/ggml/src /build/go-bert/bert.cpp/build /build/go-bert/bert.cpp/build/ggml/src /build/go-bert/bert.cpp/build/ggml/src/CMakeFiles/ggml.dir/DependInfo.cmake "--color="
localai-api-1  | Dependencies file "ggml/src/CMakeFiles/ggml.dir/ggml.c.o.d" is newer than depends file "/build/go-bert/bert.cpp/build/ggml/src/CMakeFiles/ggml.dir/compiler_depend.internal".
localai-api-1  | Consolidate compiler generated dependencies of target ggml
localai-api-1  | make[5]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make  -f ggml/src/CMakeFiles/ggml.dir/build.make ggml/src/CMakeFiles/ggml.dir/build
localai-api-1  | make[5]: Entering directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[5]: Nothing to be done for 'ggml/src/CMakeFiles/ggml.dir/build'.
localai-api-1  | make[5]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | [100%] Built target ggml
localai-api-1  | make[4]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_progress_start /build/go-bert/bert.cpp/build/CMakeFiles 0
localai-api-1  | make[3]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | make[2]: Leaving directory '/build/go-bert/bert.cpp/build'
localai-api-1  | ar src libgobert.a gobert.o ggml.o 
localai-api-1  | make[1]: Leaving directory '/build/go-bert'
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-bert LIBRARY_PATH=/build/go-bert \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/bert-embeddings ./cmd/grpc/bert-embeddings/
localai-api-1  | make -C go-ggllm BUILD_TYPE= libggllm.a
localai-api-1  | make[1]: Entering directory '/build/go-ggllm'
localai-api-1  | I ggllm.cpp build info: 
localai-api-1  | I UNAME_S:  Linux
localai-api-1  | I UNAME_P:  unknown
localai-api-1  | I UNAME_M:  x86_64
localai-api-1  | I CFLAGS:   -I./ggllm.cpp -I. -O3 -DNDEBUG -std=c11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -Wno-unused-function -pthread -march=native -mtune=native
localai-api-1  | I CXXFLAGS: -I./ggllm.cpp -I. -I./ggllm.cpp/examples -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -pthread
localai-api-1  | I CGO_LDFLAGS:  
localai-api-1  | I LDFLAGS:  
localai-api-1  | I BUILD_TYPE:  
localai-api-1  | I CMAKE_ARGS:  -DLLAMA_F16C=OFF -DLLAMA_AVX512=OFF -DLLAMA_AVX2=OFF -DLLAMA_AVX=OFF -DLLAMA_FMA=OFF
localai-api-1  | I EXTRA_TARGETS:  
localai-api-1  | I CC:       cc (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | I CXX:      g++ (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | 
localai-api-1  | cd ggllm.cpp && patch -p1 < ../patches/1902-cuda.patch
localai-api-1  | patching file examples/falcon_common.cpp
localai-api-1  | patching file libfalcon.cpp
localai-api-1  | patching file libfalcon.h
localai-api-1  | touch prepare
localai-api-1  | mkdir -p build
localai-api-1  | cd build && cmake ../ggllm.cpp -DLLAMA_F16C=OFF -DLLAMA_AVX512=OFF -DLLAMA_AVX2=OFF -DLLAMA_AVX=OFF -DLLAMA_FMA=OFF && VERBOSE=1 cmake --build . --config Release && cp -rf CMakeFiles/ggml.dir/ggml.c.o ../ggllm.cpp/ggml.o
localai-api-1  | -- The C compiler identification is GNU 10.2.1
localai-api-1  | -- The CXX compiler identification is GNU 10.2.1
localai-api-1  | -- Detecting C compiler ABI info
localai-api-1  | -- Detecting C compiler ABI info - done
localai-api-1  | -- Check for working C compiler: /usr/bin/cc - skipped
localai-api-1  | -- Detecting C compile features
localai-api-1  | -- Detecting C compile features - done
localai-api-1  | -- Detecting CXX compiler ABI info
localai-api-1  | -- Detecting CXX compiler ABI info - done
localai-api-1  | -- Check for working CXX compiler: /usr/bin/c++ - skipped
localai-api-1  | -- Detecting CXX compile features
localai-api-1  | -- Detecting CXX compile features - done
localai-api-1  | -- Found Git: /usr/bin/git (found version "2.30.2") 
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
localai-api-1  | -- Check if compiler accepts -pthread
localai-api-1  | -- Check if compiler accepts -pthread - yes
localai-api-1  | -- Found Threads: TRUE  
localai-api-1  | -- Could not find nvcc, please set CUDAToolkit_ROOT.
localai-api-1  | CMake Warning at CMakeLists.txt:260 (message):
localai-api-1  |   cuBLAS not found
localai-api-1  | 
localai-api-1  | 
localai-api-1  | -- CMAKE_SYSTEM_PROCESSOR: x86_64
localai-api-1  | -- x86 detected
localai-api-1  | -- Configuring done (0.9s)
localai-api-1  | -- Generating done (0.1s)
localai-api-1  | -- Build files have been written to: /build/go-ggllm/build
localai-api-1  | Change Dir: '/build/go-ggllm/build'
localai-api-1  | 
localai-api-1  | Run Build Command(s): /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E env VERBOSE=1 /usr/bin/gmake -f Makefile
localai-api-1  | gmake[2]: Entering directory '/build/go-ggllm/build'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -S/build/go-ggllm/ggllm.cpp -B/build/go-ggllm/build --check-build-system CMakeFiles/Makefile.cmake 0
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_progress_start /build/go-ggllm/build/CMakeFiles /build/go-ggllm/build//CMakeFiles/progress.marks
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/Makefile2 all
localai-api-1  | gmake[3]: Entering directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/BUILD_INFO.dir/build.make CMakeFiles/BUILD_INFO.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp /build/go-ggllm/build /build/go-ggllm/build /build/go-ggllm/build/CMakeFiles/BUILD_INFO.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/BUILD_INFO.dir/build.make CMakeFiles/BUILD_INFO.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | gmake[4]: Nothing to be done for 'CMakeFiles/BUILD_INFO.dir/build'.
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [  2%] Built target BUILD_INFO
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/ggml.dir/build.make CMakeFiles/ggml.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp /build/go-ggllm/build /build/go-ggllm/build /build/go-ggllm/build/CMakeFiles/ggml.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/ggml.dir/build.make CMakeFiles/ggml.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [  4%] Building C object CMakeFiles/ggml.dir/ggml.c.o
localai-api-1  | /usr/bin/cc -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu11 -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -MD -MT CMakeFiles/ggml.dir/ggml.c.o -MF CMakeFiles/ggml.dir/ggml.c.o.d -o CMakeFiles/ggml.dir/ggml.c.o -c /build/go-ggllm/ggllm.cpp/ggml.c
localai-api-1  | /build/go-ggllm/ggllm.cpp/ggml.c: In function ‘quantize_row_q8_0’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/ggml.c:1134:15: warning: unused variable ‘nb’ [-Wunused-variable]
localai-api-1  |  1134 |     const int nb = k / QK8_0;
localai-api-1  |       |               ^~
localai-api-1  | /build/go-ggllm/ggllm.cpp/ggml.c: In function ‘quantize_row_q8_1’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/ggml.c:1329:15: warning: unused variable ‘nb’ [-Wunused-variable]
localai-api-1  |  1329 |     const int nb = k / QK8_1;
localai-api-1  |       |               ^~
localai-api-1  | /build/go-ggllm/ggllm.cpp/ggml.c: In function ‘ggml_compute_forward_mul_mat_f32’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/ggml.c:10924:19: warning: unused variable ‘ne10’ [-Wunused-variable]
localai-api-1  | 10924 |     const int64_t ne10 = src1->ne[0];
localai-api-1  |       |                   ^~~~
localai-api-1  | [  6%] Building C object CMakeFiles/ggml.dir/k_quants.c.o
localai-api-1  | /usr/bin/cc -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu11 -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -MD -MT CMakeFiles/ggml.dir/k_quants.c.o -MF CMakeFiles/ggml.dir/k_quants.c.o.d -o CMakeFiles/ggml.dir/k_quants.c.o -c /build/go-ggllm/ggllm.cpp/k_quants.c
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [  6%] Built target ggml
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/ggml_static.dir/build.make CMakeFiles/ggml_static.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp /build/go-ggllm/build /build/go-ggllm/build /build/go-ggllm/build/CMakeFiles/ggml_static.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/ggml_static.dir/build.make CMakeFiles/ggml_static.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [  8%] Linking C static library libggml_static.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/ggml_static.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/ggml_static.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libggml_static.a CMakeFiles/ggml.dir/ggml.c.o CMakeFiles/ggml.dir/k_quants.c.o
localai-api-1  | /usr/bin/ranlib libggml_static.a
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [  8%] Built target ggml_static
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/llama.dir/build.make CMakeFiles/llama.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp /build/go-ggllm/build /build/go-ggllm/build /build/go-ggllm/build/CMakeFiles/llama.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/llama.dir/build.make CMakeFiles/llama.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 10%] Building CXX object CMakeFiles/llama.dir/llama.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/llama.dir/llama.cpp.o -MF CMakeFiles/llama.dir/llama.cpp.o.d -o CMakeFiles/llama.dir/llama.cpp.o -c /build/go-ggllm/ggllm.cpp/llama.cpp
localai-api-1  | [ 12%] Linking CXX static library libllama.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/llama.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llama.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libllama.a CMakeFiles/llama.dir/llama.cpp.o CMakeFiles/ggml.dir/ggml.c.o CMakeFiles/ggml.dir/k_quants.c.o
localai-api-1  | /usr/bin/ranlib libllama.a
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 12%] Built target llama
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/cmpnct_unicode.dir/build.make CMakeFiles/cmpnct_unicode.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp /build/go-ggllm/build /build/go-ggllm/build /build/go-ggllm/build/CMakeFiles/cmpnct_unicode.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/cmpnct_unicode.dir/build.make CMakeFiles/cmpnct_unicode.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 14%] Building CXX object CMakeFiles/cmpnct_unicode.dir/cmpnct_unicode.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/cmpnct_unicode.dir/cmpnct_unicode.cpp.o -MF CMakeFiles/cmpnct_unicode.dir/cmpnct_unicode.cpp.o.d -o CMakeFiles/cmpnct_unicode.dir/cmpnct_unicode.cpp.o -c /build/go-ggllm/ggllm.cpp/cmpnct_unicode.cpp
localai-api-1  | [ 16%] Linking CXX static library libcmpnct_unicode.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/cmpnct_unicode.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/cmpnct_unicode.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libcmpnct_unicode.a CMakeFiles/cmpnct_unicode.dir/cmpnct_unicode.cpp.o
localai-api-1  | /usr/bin/ranlib libcmpnct_unicode.a
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 16%] Built target cmpnct_unicode
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/falcon.dir/build.make CMakeFiles/falcon.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp /build/go-ggllm/build /build/go-ggllm/build /build/go-ggllm/build/CMakeFiles/falcon.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/falcon.dir/build.make CMakeFiles/falcon.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 18%] Building CXX object CMakeFiles/falcon.dir/libfalcon.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/falcon.dir/libfalcon.cpp.o -MF CMakeFiles/falcon.dir/libfalcon.cpp.o.d -o CMakeFiles/falcon.dir/libfalcon.cpp.o -c /build/go-ggllm/ggllm.cpp/libfalcon.cpp
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp: In function ‘bool kv_cache_init(const falcon_hparams&, falcon_kv_cache&, ggml_type, int, int)’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp:1317:19: warning: unused variable ‘n_layer’ [-Wunused-variable]
localai-api-1  |  1317 |     const int64_t n_layer = hparams.n_layer;
localai-api-1  |       |                   ^~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp: In function ‘void falcon_model_load_internal(const string&, falcon_context&, int, int, int, int, ggml_type, bool, bool, bool, falcon_progress_callback, void*)’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp:1624:9: warning: unused variable ‘vram_reserved’ [-Wunused-variable]
localai-api-1  |  1624 |     int vram_reserved=128*MB;    // that amount of VRAM is to stay free on GPU (needs to become a user parameter)
localai-api-1  |       |         ^~~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp:1625:12: warning: unused variable ‘vram_overhead’ [-Wunused-variable]
localai-api-1  |  1625 |     size_t vram_overhead = 32*MB;    // this amount of vram is estimated for non weight storage buffers on VRAM
localai-api-1  |       |            ^~~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp: In function ‘bool falcon_eval_internal(falcon_context&, const falcon_token*, int, int, int, const char*, int)’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp:2068:15: warning: unused variable ‘i_gpu_last’ [-Wunused-variable]
localai-api-1  |  2068 |     const int i_gpu_last = lctx.model.i_gpu_last > 0 ? lctx.model.i_gpu_last : n_layer;
localai-api-1  |       |               ^~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp:2076:20: warning: unused variable ‘offload_func_nr’ [-Wunused-variable]
localai-api-1  |  2076 |     offload_func_t offload_func_nr = llama_nop; // nr = non-repeating
localai-api-1  |       |                    ^~~~~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp:2077:20: warning: unused variable ‘offload_func_kqv’ [-Wunused-variable]
localai-api-1  |  2077 |     offload_func_t offload_func_kqv = llama_nop;
localai-api-1  |       |                    ^~~~~~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp:2316:20: warning: unused variable ‘offload_func’ [-Wunused-variable]
localai-api-1  |  2316 |     offload_func_t offload_func = llama_nop;
localai-api-1  |       |                    ^~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp: In function ‘size_t falcon_copy_state_data(falcon_context*, uint8_t*)’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp:4114:22: warning: unused variable ‘n_embd’ [-Wunused-variable]
localai-api-1  |  4114 |         const int    n_embd  = hparams.n_embd;
localai-api-1  |       |                      ^~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp: In function ‘size_t falcon_set_state_data(falcon_context*, uint8_t*)’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/libfalcon.cpp:4230:22: warning: unused variable ‘n_embd’ [-Wunused-variable]
localai-api-1  |  4230 |         const int    n_embd  = hparams.n_embd;
localai-api-1  |       |                      ^~~~~~
localai-api-1  | [ 20%] Linking CXX static library libfalcon.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/falcon.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/falcon.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libfalcon.a CMakeFiles/falcon.dir/libfalcon.cpp.o CMakeFiles/ggml.dir/ggml.c.o CMakeFiles/ggml.dir/k_quants.c.o
localai-api-1  | /usr/bin/ranlib libfalcon.a
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 20%] Built target falcon
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-quantize-fns.dir/build.make tests/CMakeFiles/test-quantize-fns.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/tests /build/go-ggllm/build /build/go-ggllm/build/tests /build/go-ggllm/build/tests/CMakeFiles/test-quantize-fns.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-quantize-fns.dir/build.make tests/CMakeFiles/test-quantize-fns.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 22%] Building CXX object tests/CMakeFiles/test-quantize-fns.dir/test-quantize-fns.cpp.o
localai-api-1  | cd /build/go-ggllm/build/tests && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT tests/CMakeFiles/test-quantize-fns.dir/test-quantize-fns.cpp.o -MF CMakeFiles/test-quantize-fns.dir/test-quantize-fns.cpp.o.d -o CMakeFiles/test-quantize-fns.dir/test-quantize-fns.cpp.o -c /build/go-ggllm/ggllm.cpp/tests/test-quantize-fns.cpp
localai-api-1  | [ 25%] Linking CXX executable ../bin/test-quantize-fns
localai-api-1  | cd /build/go-ggllm/build/tests && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/test-quantize-fns.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/test-quantize-fns.dir/test-quantize-fns.cpp.o" -o ../bin/test-quantize-fns  ../libllama.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 25%] Built target test-quantize-fns
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-quantize-perf.dir/build.make tests/CMakeFiles/test-quantize-perf.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/tests /build/go-ggllm/build /build/go-ggllm/build/tests /build/go-ggllm/build/tests/CMakeFiles/test-quantize-perf.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-quantize-perf.dir/build.make tests/CMakeFiles/test-quantize-perf.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 27%] Building CXX object tests/CMakeFiles/test-quantize-perf.dir/test-quantize-perf.cpp.o
localai-api-1  | cd /build/go-ggllm/build/tests && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT tests/CMakeFiles/test-quantize-perf.dir/test-quantize-perf.cpp.o -MF CMakeFiles/test-quantize-perf.dir/test-quantize-perf.cpp.o.d -o CMakeFiles/test-quantize-perf.dir/test-quantize-perf.cpp.o -c /build/go-ggllm/ggllm.cpp/tests/test-quantize-perf.cpp
localai-api-1  | [ 29%] Linking CXX executable ../bin/test-quantize-perf
localai-api-1  | cd /build/go-ggllm/build/tests && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/test-quantize-perf.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/test-quantize-perf.dir/test-quantize-perf.cpp.o" -o ../bin/test-quantize-perf  ../libllama.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 29%] Built target test-quantize-perf
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-sampling.dir/build.make tests/CMakeFiles/test-sampling.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/tests /build/go-ggllm/build /build/go-ggllm/build/tests /build/go-ggllm/build/tests/CMakeFiles/test-sampling.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-sampling.dir/build.make tests/CMakeFiles/test-sampling.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 31%] Building CXX object tests/CMakeFiles/test-sampling.dir/test-sampling.cpp.o
localai-api-1  | cd /build/go-ggllm/build/tests && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT tests/CMakeFiles/test-sampling.dir/test-sampling.cpp.o -MF CMakeFiles/test-sampling.dir/test-sampling.cpp.o.d -o CMakeFiles/test-sampling.dir/test-sampling.cpp.o -c /build/go-ggllm/ggllm.cpp/tests/test-sampling.cpp
localai-api-1  | In file included from /build/go-ggllm/ggllm.cpp/tests/test-sampling.cpp:2:
localai-api-1  | /build/go-ggllm/ggllm.cpp/./libfalcon.h:252:24: warning: ‘FINETUNE_NAME’ defined but not used [-Wunused-variable]
localai-api-1  |   252 |     static const char *FINETUNE_NAME[6] = { "UNSPECIFIED", "NONE", "ALPACA", "OPENASSISTANT", "WIZARD", "FALCONINSTRUCT" };
localai-api-1  |       |                        ^~~~~~~~~~~~~
localai-api-1  | [ 33%] Linking CXX executable ../bin/test-sampling
localai-api-1  | cd /build/go-ggllm/build/tests && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/test-sampling.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/test-sampling.dir/test-sampling.cpp.o" -o ../bin/test-sampling  ../libllama.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 33%] Built target test-sampling
localai-api-1  | /usr/bin/gmake  -f examples/CMakeFiles/common.dir/build.make examples/CMakeFiles/common.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples /build/go-ggllm/build /build/go-ggllm/build/examples /build/go-ggllm/build/examples/CMakeFiles/common.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/CMakeFiles/common.dir/build.make examples/CMakeFiles/common.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 35%] Building CXX object examples/CMakeFiles/common.dir/common.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/CMakeFiles/common.dir/common.cpp.o -MF CMakeFiles/common.dir/common.cpp.o.d -o CMakeFiles/common.dir/common.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/common.cpp
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 35%] Built target common
localai-api-1  | /usr/bin/gmake  -f examples/CMakeFiles/falcon_common.dir/build.make examples/CMakeFiles/falcon_common.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples /build/go-ggllm/build /build/go-ggllm/build/examples /build/go-ggllm/build/examples/CMakeFiles/falcon_common.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/CMakeFiles/falcon_common.dir/build.make examples/CMakeFiles/falcon_common.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 37%] Building CXX object examples/CMakeFiles/falcon_common.dir/falcon_common.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/CMakeFiles/falcon_common.dir/falcon_common.cpp.o -MF CMakeFiles/falcon_common.dir/falcon_common.cpp.o.d -o CMakeFiles/falcon_common.dir/falcon_common.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/falcon_common.cpp
localai-api-1  | In file included from /build/go-ggllm/ggllm.cpp/examples/falcon_common.h:6,
localai-api-1  |                  from /build/go-ggllm/ggllm.cpp/examples/falcon_common.cpp:1:
localai-api-1  | /build/go-ggllm/ggllm.cpp/./libfalcon.h:252:24: warning: ‘FINETUNE_NAME’ defined but not used [-Wunused-variable]
localai-api-1  |   252 |     static const char *FINETUNE_NAME[6] = { "UNSPECIFIED", "NONE", "ALPACA", "OPENASSISTANT", "WIZARD", "FALCONINSTRUCT" };
localai-api-1  |       |                        ^~~~~~~~~~~~~
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 37%] Built target falcon_common
localai-api-1  | /usr/bin/gmake  -f examples/main/CMakeFiles/main.dir/build.make examples/main/CMakeFiles/main.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/main /build/go-ggllm/build /build/go-ggllm/build/examples/main /build/go-ggllm/build/examples/main/CMakeFiles/main.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/main/CMakeFiles/main.dir/build.make examples/main/CMakeFiles/main.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 39%] Building CXX object examples/main/CMakeFiles/main.dir/main.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/main && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/main/CMakeFiles/main.dir/main.cpp.o -MF CMakeFiles/main.dir/main.cpp.o.d -o CMakeFiles/main.dir/main.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/main/main.cpp
localai-api-1  | [ 41%] Linking CXX executable ../../bin/main
localai-api-1  | cd /build/go-ggllm/build/examples/main && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/main.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/main.dir/main.cpp.o ../CMakeFiles/common.dir/common.cpp.o -o ../../bin/main  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 41%] Built target main
localai-api-1  | /usr/bin/gmake  -f examples/falcon/CMakeFiles/falcon_main.dir/build.make examples/falcon/CMakeFiles/falcon_main.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/falcon /build/go-ggllm/build /build/go-ggllm/build/examples/falcon /build/go-ggllm/build/examples/falcon/CMakeFiles/falcon_main.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/falcon/CMakeFiles/falcon_main.dir/build.make examples/falcon/CMakeFiles/falcon_main.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 43%] Building CXX object examples/falcon/CMakeFiles/falcon_main.dir/falcon_main.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/falcon && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/falcon/CMakeFiles/falcon_main.dir/falcon_main.cpp.o -MF CMakeFiles/falcon_main.dir/falcon_main.cpp.o.d -o CMakeFiles/falcon_main.dir/falcon_main.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/falcon/falcon_main.cpp
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/falcon/falcon_main.cpp: In function ‘int main(int, char**)’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/falcon/falcon_main.cpp:963:27: warning: suggest parentheses around ‘&&’ within ‘||’ [-Wparentheses]
localai-api-1  |   963 |         if (!embd.empty() && embd.back() == falcon_token_eos() || stopword_fulfilled)
localai-api-1  |       |             ~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/falcon/falcon_main.cpp:433:10: warning: unused variable ‘falcon_token_newline’ [-Wunused-variable]
localai-api-1  |   433 |     auto falcon_token_newline = falcon_token_nl();
localai-api-1  |       |          ^~~~~~~~~~~~~~~~~~~~
localai-api-1  | [ 45%] Linking CXX executable ../../bin/falcon_main
localai-api-1  | cd /build/go-ggllm/build/examples/falcon && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/falcon_main.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/falcon_main.dir/falcon_main.cpp.o ../CMakeFiles/falcon_common.dir/falcon_common.cpp.o -o ../../bin/falcon_main  ../../libfalcon.a -pthread ../../libcmpnct_unicode.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 45%] Built target falcon_main
localai-api-1  | /usr/bin/gmake  -f examples/falcon_quantize/CMakeFiles/falcon_quantize.dir/build.make examples/falcon_quantize/CMakeFiles/falcon_quantize.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/falcon_quantize /build/go-ggllm/build /build/go-ggllm/build/examples/falcon_quantize /build/go-ggllm/build/examples/falcon_quantize/CMakeFiles/falcon_quantize.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/falcon_quantize/CMakeFiles/falcon_quantize.dir/build.make examples/falcon_quantize/CMakeFiles/falcon_quantize.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 47%] Building CXX object examples/falcon_quantize/CMakeFiles/falcon_quantize.dir/quantize.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/falcon_quantize && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/falcon_quantize/CMakeFiles/falcon_quantize.dir/quantize.cpp.o -MF CMakeFiles/falcon_quantize.dir/quantize.cpp.o.d -o CMakeFiles/falcon_quantize.dir/quantize.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/falcon_quantize/quantize.cpp
localai-api-1  | In file included from /build/go-ggllm/ggllm.cpp/examples/falcon_quantize/quantize.cpp:3:
localai-api-1  | /build/go-ggllm/ggllm.cpp/./libfalcon.h:252:24: warning: ‘FINETUNE_NAME’ defined but not used [-Wunused-variable]
localai-api-1  |   252 |     static const char *FINETUNE_NAME[6] = { "UNSPECIFIED", "NONE", "ALPACA", "OPENASSISTANT", "WIZARD", "FALCONINSTRUCT" };
localai-api-1  |       |                        ^~~~~~~~~~~~~
localai-api-1  | [ 50%] Linking CXX executable ../../bin/falcon_quantize
localai-api-1  | cd /build/go-ggllm/build/examples/falcon_quantize && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/falcon_quantize.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/falcon_quantize.dir/quantize.cpp.o -o ../../bin/falcon_quantize  ../../libfalcon.a -pthread ../../libcmpnct_unicode.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 50%] Built target falcon_quantize
localai-api-1  | /usr/bin/gmake  -f examples/quantize/CMakeFiles/quantize.dir/build.make examples/quantize/CMakeFiles/quantize.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/quantize /build/go-ggllm/build /build/go-ggllm/build/examples/quantize /build/go-ggllm/build/examples/quantize/CMakeFiles/quantize.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/quantize/CMakeFiles/quantize.dir/build.make examples/quantize/CMakeFiles/quantize.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 52%] Building CXX object examples/quantize/CMakeFiles/quantize.dir/quantize.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/quantize && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/quantize/CMakeFiles/quantize.dir/quantize.cpp.o -MF CMakeFiles/quantize.dir/quantize.cpp.o.d -o CMakeFiles/quantize.dir/quantize.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/quantize/quantize.cpp
localai-api-1  | [ 54%] Linking CXX executable ../../bin/quantize
localai-api-1  | cd /build/go-ggllm/build/examples/quantize && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/quantize.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/quantize.dir/quantize.cpp.o -o ../../bin/quantize  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 54%] Built target quantize
localai-api-1  | /usr/bin/gmake  -f examples/quantize-stats/CMakeFiles/quantize-stats.dir/build.make examples/quantize-stats/CMakeFiles/quantize-stats.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/quantize-stats /build/go-ggllm/build /build/go-ggllm/build/examples/quantize-stats /build/go-ggllm/build/examples/quantize-stats/CMakeFiles/quantize-stats.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/quantize-stats/CMakeFiles/quantize-stats.dir/build.make examples/quantize-stats/CMakeFiles/quantize-stats.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 56%] Building CXX object examples/quantize-stats/CMakeFiles/quantize-stats.dir/quantize-stats.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/quantize-stats && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/quantize-stats/CMakeFiles/quantize-stats.dir/quantize-stats.cpp.o -MF CMakeFiles/quantize-stats.dir/quantize-stats.cpp.o.d -o CMakeFiles/quantize-stats.dir/quantize-stats.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/quantize-stats/quantize-stats.cpp
localai-api-1  | [ 58%] Linking CXX executable ../../bin/quantize-stats
localai-api-1  | cd /build/go-ggllm/build/examples/quantize-stats && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/quantize-stats.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/quantize-stats.dir/quantize-stats.cpp.o" -o ../../bin/quantize-stats  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 58%] Built target quantize-stats
localai-api-1  | /usr/bin/gmake  -f examples/perplexity/CMakeFiles/perplexity.dir/build.make examples/perplexity/CMakeFiles/perplexity.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/perplexity /build/go-ggllm/build /build/go-ggllm/build/examples/perplexity /build/go-ggllm/build/examples/perplexity/CMakeFiles/perplexity.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/perplexity/CMakeFiles/perplexity.dir/build.make examples/perplexity/CMakeFiles/perplexity.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 60%] Building CXX object examples/perplexity/CMakeFiles/perplexity.dir/perplexity.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/perplexity && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/perplexity/CMakeFiles/perplexity.dir/perplexity.cpp.o -MF CMakeFiles/perplexity.dir/perplexity.cpp.o.d -o CMakeFiles/perplexity.dir/perplexity.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/perplexity/perplexity.cpp
localai-api-1  | [ 62%] Linking CXX executable ../../bin/perplexity
localai-api-1  | cd /build/go-ggllm/build/examples/perplexity && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/perplexity.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/perplexity.dir/perplexity.cpp.o ../CMakeFiles/common.dir/common.cpp.o -o ../../bin/perplexity  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 62%] Built target perplexity
localai-api-1  | /usr/bin/gmake  -f examples/falcon_perplexity/CMakeFiles/falcon_perplexity.dir/build.make examples/falcon_perplexity/CMakeFiles/falcon_perplexity.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/falcon_perplexity /build/go-ggllm/build /build/go-ggllm/build/examples/falcon_perplexity /build/go-ggllm/build/examples/falcon_perplexity/CMakeFiles/falcon_perplexity.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/falcon_perplexity/CMakeFiles/falcon_perplexity.dir/build.make examples/falcon_perplexity/CMakeFiles/falcon_perplexity.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 64%] Building CXX object examples/falcon_perplexity/CMakeFiles/falcon_perplexity.dir/falcon_perplexity.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/falcon_perplexity && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/falcon_perplexity/CMakeFiles/falcon_perplexity.dir/falcon_perplexity.cpp.o -MF CMakeFiles/falcon_perplexity.dir/falcon_perplexity.cpp.o.d -o CMakeFiles/falcon_perplexity.dir/falcon_perplexity.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/falcon_perplexity/falcon_perplexity.cpp
localai-api-1  | In file included from /build/go-ggllm/ggllm.cpp/examples/falcon_common.h:6,
localai-api-1  |                  from /build/go-ggllm/ggllm.cpp/examples/falcon_perplexity/falcon_perplexity.cpp:1:
localai-api-1  | /build/go-ggllm/ggllm.cpp/./libfalcon.h:252:24: warning: ‘FINETUNE_NAME’ defined but not used [-Wunused-variable]
localai-api-1  |   252 |     static const char *FINETUNE_NAME[6] = { "UNSPECIFIED", "NONE", "ALPACA", "OPENASSISTANT", "WIZARD", "FALCONINSTRUCT" };
localai-api-1  |       |                        ^~~~~~~~~~~~~
localai-api-1  | [ 66%] Linking CXX executable ../../bin/falcon_perplexity
localai-api-1  | cd /build/go-ggllm/build/examples/falcon_perplexity && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/falcon_perplexity.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/falcon_perplexity.dir/falcon_perplexity.cpp.o ../CMakeFiles/falcon_common.dir/falcon_common.cpp.o -o ../../bin/falcon_perplexity  ../../libfalcon.a -pthread ../../libcmpnct_unicode.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 66%] Built target falcon_perplexity
localai-api-1  | /usr/bin/gmake  -f examples/embedding/CMakeFiles/embedding.dir/build.make examples/embedding/CMakeFiles/embedding.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/embedding /build/go-ggllm/build /build/go-ggllm/build/examples/embedding /build/go-ggllm/build/examples/embedding/CMakeFiles/embedding.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/embedding/CMakeFiles/embedding.dir/build.make examples/embedding/CMakeFiles/embedding.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 68%] Building CXX object examples/embedding/CMakeFiles/embedding.dir/embedding.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/embedding && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/embedding/CMakeFiles/embedding.dir/embedding.cpp.o -MF CMakeFiles/embedding.dir/embedding.cpp.o.d -o CMakeFiles/embedding.dir/embedding.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/embedding/embedding.cpp
localai-api-1  | [ 70%] Linking CXX executable ../../bin/embedding
localai-api-1  | cd /build/go-ggllm/build/examples/embedding && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/embedding.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/embedding.dir/embedding.cpp.o ../CMakeFiles/common.dir/common.cpp.o -o ../../bin/embedding  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 70%] Built target embedding
localai-api-1  | /usr/bin/gmake  -f examples/save-load-state/CMakeFiles/save-load-state.dir/build.make examples/save-load-state/CMakeFiles/save-load-state.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/save-load-state /build/go-ggllm/build /build/go-ggllm/build/examples/save-load-state /build/go-ggllm/build/examples/save-load-state/CMakeFiles/save-load-state.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/save-load-state/CMakeFiles/save-load-state.dir/build.make examples/save-load-state/CMakeFiles/save-load-state.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 72%] Building CXX object examples/save-load-state/CMakeFiles/save-load-state.dir/save-load-state.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/save-load-state && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/save-load-state/CMakeFiles/save-load-state.dir/save-load-state.cpp.o -MF CMakeFiles/save-load-state.dir/save-load-state.cpp.o.d -o CMakeFiles/save-load-state.dir/save-load-state.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/save-load-state/save-load-state.cpp
localai-api-1  | [ 75%] Linking CXX executable ../../bin/save-load-state
localai-api-1  | cd /build/go-ggllm/build/examples/save-load-state && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/save-load-state.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/save-load-state.dir/save-load-state.cpp.o" ../CMakeFiles/common.dir/common.cpp.o -o ../../bin/save-load-state  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 75%] Built target save-load-state
localai-api-1  | /usr/bin/gmake  -f examples/benchmark/CMakeFiles/benchmark.dir/build.make examples/benchmark/CMakeFiles/benchmark.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/benchmark /build/go-ggllm/build /build/go-ggllm/build/examples/benchmark /build/go-ggllm/build/examples/benchmark/CMakeFiles/benchmark.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/benchmark/CMakeFiles/benchmark.dir/build.make examples/benchmark/CMakeFiles/benchmark.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 77%] Building CXX object examples/benchmark/CMakeFiles/benchmark.dir/benchmark-matmult.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/benchmark && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/benchmark/CMakeFiles/benchmark.dir/benchmark-matmult.cpp.o -MF CMakeFiles/benchmark.dir/benchmark-matmult.cpp.o.d -o CMakeFiles/benchmark.dir/benchmark-matmult.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/benchmark/benchmark-matmult.cpp
localai-api-1  | [ 79%] Linking CXX executable ../../bin/benchmark
localai-api-1  | cd /build/go-ggllm/build/examples/benchmark && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/benchmark.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/benchmark.dir/benchmark-matmult.cpp.o" ../CMakeFiles/common.dir/common.cpp.o -o ../../bin/benchmark  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 79%] Built target benchmark
localai-api-1  | /usr/bin/gmake  -f examples/baby-llama/CMakeFiles/baby-llama.dir/build.make examples/baby-llama/CMakeFiles/baby-llama.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/baby-llama /build/go-ggllm/build /build/go-ggllm/build/examples/baby-llama /build/go-ggllm/build/examples/baby-llama/CMakeFiles/baby-llama.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/baby-llama/CMakeFiles/baby-llama.dir/build.make examples/baby-llama/CMakeFiles/baby-llama.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 81%] Building CXX object examples/baby-llama/CMakeFiles/baby-llama.dir/baby-llama.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/baby-llama && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/baby-llama/CMakeFiles/baby-llama.dir/baby-llama.cpp.o -MF CMakeFiles/baby-llama.dir/baby-llama.cpp.o.d -o CMakeFiles/baby-llama.dir/baby-llama.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/baby-llama/baby-llama.cpp
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/baby-llama/baby-llama.cpp: In function ‘int main(int, char**)’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/baby-llama/baby-llama.cpp:1602:32: warning: variable ‘opt_params_adam’ set but not used [-Wunused-but-set-variable]
localai-api-1  |  1602 |         struct ggml_opt_params opt_params_adam = ggml_opt_default_params(GGML_OPT_ADAM);
localai-api-1  |       |                                ^~~~~~~~~~~~~~~
localai-api-1  | [ 83%] Linking CXX executable ../../bin/baby-llama
localai-api-1  | cd /build/go-ggllm/build/examples/baby-llama && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/baby-llama.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/baby-llama.dir/baby-llama.cpp.o" ../CMakeFiles/common.dir/common.cpp.o -o ../../bin/baby-llama  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 83%] Built target baby-llama
localai-api-1  | /usr/bin/gmake  -f examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/build.make examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch /build/go-ggllm/build /build/go-ggllm/build/examples/train-text-from-scratch /build/go-ggllm/build/examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/build.make examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 85%] Building CXX object examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/train-text-from-scratch.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/train-text-from-scratch && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/train-text-from-scratch.cpp.o -MF CMakeFiles/train-text-from-scratch.dir/train-text-from-scratch.cpp.o.d -o CMakeFiles/train-text-from-scratch.dir/train-text-from-scratch.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp: In function ‘void write_tensor(llama_file*, ggml_tensor*)’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp:2371:21: warning: suggest parentheses around ‘-’ in operand of ‘&’ [-Wparentheses]
localai-api-1  |  2371 |         file->seek(0-file->tell() & 31, SEEK_CUR);
localai-api-1  |       |                    ~^~~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp:2386:17: warning: suggest parentheses around ‘-’ in operand of ‘&’ [-Wparentheses]
localai-api-1  |  2386 |     file->seek(0-file->tell() & 31, SEEK_CUR);
localai-api-1  |       |                ~^~~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp: In function ‘void read_tensor(llama_file*, ggml_tensor*)’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp:2407:17: warning: suggest parentheses around ‘-’ in operand of ‘&’ [-Wparentheses]
localai-api-1  |  2407 |     file->seek(0-file->tell() & 31, SEEK_CUR);
localai-api-1  |       |                ~^~~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp: In function ‘void init_model(my_llama_model*)’:
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp:305:16: warning: ‘char* strncpy(char*, const char*, size_t)’ specified bound 64 equals destination size [-Wstringop-truncation]
localai-api-1  |   305 |         strncpy(layer.w1->name, (layers_i + ".feed_forward.w1.weight").c_str(), sizeof(layer.w1->name));
localai-api-1  |       |         ~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp:306:16: warning: ‘char* strncpy(char*, const char*, size_t)’ specified bound 64 equals destination size [-Wstringop-truncation]
localai-api-1  |   306 |         strncpy(layer.w2->name, (layers_i + ".feed_forward.w2.weight").c_str(), sizeof(layer.w2->name));
localai-api-1  |       |         ~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | /build/go-ggllm/ggllm.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp:307:16: warning: ‘char* strncpy(char*, const char*, size_t)’ specified bound 64 equals destination size [-Wstringop-truncation]
localai-api-1  |   307 |         strncpy(layer.w3->name, (layers_i + ".feed_forward.w3.weight").c_str(), sizeof(layer.w3->name));
localai-api-1  |       |         ~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | [ 87%] Linking CXX executable ../../bin/train-text-from-scratch
localai-api-1  | cd /build/go-ggllm/build/examples/train-text-from-scratch && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/train-text-from-scratch.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/train-text-from-scratch.dir/train-text-from-scratch.cpp.o" ../CMakeFiles/common.dir/common.cpp.o -o ../../bin/train-text-from-scratch  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 87%] Built target train-text-from-scratch
localai-api-1  | /usr/bin/gmake  -f examples/simple/CMakeFiles/simple.dir/build.make examples/simple/CMakeFiles/simple.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/examples/simple /build/go-ggllm/build /build/go-ggllm/build/examples/simple /build/go-ggllm/build/examples/simple/CMakeFiles/simple.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f examples/simple/CMakeFiles/simple.dir/build.make examples/simple/CMakeFiles/simple.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 89%] Building CXX object examples/simple/CMakeFiles/simple.dir/simple.cpp.o
localai-api-1  | cd /build/go-ggllm/build/examples/simple && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/examples -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/simple/CMakeFiles/simple.dir/simple.cpp.o -MF CMakeFiles/simple.dir/simple.cpp.o.d -o CMakeFiles/simple.dir/simple.cpp.o -c /build/go-ggllm/ggllm.cpp/examples/simple/simple.cpp
localai-api-1  | [ 91%] Linking CXX executable ../../bin/simple
localai-api-1  | cd /build/go-ggllm/build/examples/simple && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/simple.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/simple.dir/simple.cpp.o ../CMakeFiles/common.dir/common.cpp.o -o ../../bin/simple  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 91%] Built target simple
localai-api-1  | /usr/bin/gmake  -f pocs/vdot/CMakeFiles/vdot.dir/build.make pocs/vdot/CMakeFiles/vdot.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/pocs/vdot /build/go-ggllm/build /build/go-ggllm/build/pocs/vdot /build/go-ggllm/build/pocs/vdot/CMakeFiles/vdot.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f pocs/vdot/CMakeFiles/vdot.dir/build.make pocs/vdot/CMakeFiles/vdot.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 93%] Building CXX object pocs/vdot/CMakeFiles/vdot.dir/vdot.cpp.o
localai-api-1  | cd /build/go-ggllm/build/pocs/vdot && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/pocs -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT pocs/vdot/CMakeFiles/vdot.dir/vdot.cpp.o -MF CMakeFiles/vdot.dir/vdot.cpp.o.d -o CMakeFiles/vdot.dir/vdot.cpp.o -c /build/go-ggllm/ggllm.cpp/pocs/vdot/vdot.cpp
localai-api-1  | [ 95%] Linking CXX executable ../../bin/vdot
localai-api-1  | cd /build/go-ggllm/build/pocs/vdot && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/vdot.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/vdot.dir/vdot.cpp.o ../../examples/CMakeFiles/common.dir/common.cpp.o -o ../../bin/vdot  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [ 95%] Built target vdot
localai-api-1  | /usr/bin/gmake  -f pocs/vdot/CMakeFiles/q8dot.dir/build.make pocs/vdot/CMakeFiles/q8dot.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | cd /build/go-ggllm/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-ggllm/ggllm.cpp /build/go-ggllm/ggllm.cpp/pocs/vdot /build/go-ggllm/build /build/go-ggllm/build/pocs/vdot /build/go-ggllm/build/pocs/vdot/CMakeFiles/q8dot.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/bin/gmake  -f pocs/vdot/CMakeFiles/q8dot.dir/build.make pocs/vdot/CMakeFiles/q8dot.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-ggllm/build'
localai-api-1  | [ 97%] Building CXX object pocs/vdot/CMakeFiles/q8dot.dir/q8dot.cpp.o
localai-api-1  | cd /build/go-ggllm/build/pocs/vdot && /usr/bin/c++ -DGGML_PERF=1 -DGGML_USE_K_QUANTS -I/usr/local/cuda/include -I/build/go-ggllm/ggllm.cpp/pocs -I/build/go-ggllm/ggllm.cpp/examples/. -I/build/go-ggllm/ggllm.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT pocs/vdot/CMakeFiles/q8dot.dir/q8dot.cpp.o -MF CMakeFiles/q8dot.dir/q8dot.cpp.o.d -o CMakeFiles/q8dot.dir/q8dot.cpp.o -c /build/go-ggllm/ggllm.cpp/pocs/vdot/q8dot.cpp
localai-api-1  | [100%] Linking CXX executable ../../bin/q8dot
localai-api-1  | cd /build/go-ggllm/build/pocs/vdot && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/q8dot.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/q8dot.dir/q8dot.cpp.o ../../examples/CMakeFiles/common.dir/common.cpp.o -o ../../bin/q8dot  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | [100%] Built target q8dot
localai-api-1  | gmake[3]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_progress_start /build/go-ggllm/build/CMakeFiles 0
localai-api-1  | gmake[2]: Leaving directory '/build/go-ggllm/build'
localai-api-1  | 
localai-api-1  | cd build && cp -rf CMakeFiles/cmpnct_unicode.dir/cmpnct_unicode.cpp.o ../ggllm.cpp/cmpnct_unicode.o
localai-api-1  | cd build && cp -rf CMakeFiles/llama.dir/llama.cpp.o ../ggllm.cpp/llama.o
localai-api-1  | cd build && cp -rf CMakeFiles/falcon.dir/libfalcon.cpp.o ../ggllm.cpp/libfalcon.o
localai-api-1  | cd build && cp -rf examples/CMakeFiles/falcon_common.dir/falcon_common.cpp.o ../ggllm.cpp/falcon_common.o
localai-api-1  | g++ -I./ggllm.cpp -I. -I./ggllm.cpp/examples -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -pthread -I./ggllm.cpp -I./ggllm.cpp/examples falcon_binding.cpp -o falcon_binding.o -c 
localai-api-1  | falcon_binding.cpp: In function 'int falcon_predict(void*, void*, char*, bool)':
localai-api-1  | falcon_binding.cpp:468:48: warning: cast from type 'const char*' to type 'char*' casts away qualifiers [-Wcast-qual]
localai-api-1  |   468 |             if (!returntokenCallback(state_pr, (char*)token_str)) {
localai-api-1  |       |                                                ^~~~~~~~~~~~~~~~
localai-api-1  | falcon_binding.cpp:517:27: warning: suggest parentheses around '&&' within '||' [-Wparentheses]
localai-api-1  |   517 |         if (!embd.empty() && embd.back() == falcon_token_eos() || stopword_fulfilled)
localai-api-1  |       |             ~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
localai-api-1  | falcon_binding.cpp:186:10: warning: unused variable 'falcon_token_newline' [-Wunused-variable]
localai-api-1  |   186 |     auto falcon_token_newline = falcon_token_nl();
localai-api-1  |       |          ^~~~~~~~~~~~~~~~~~~~
localai-api-1  | falcon_binding.cpp:231:10: warning: variable 'input_echo' set but not used [-Wunused-but-set-variable]
localai-api-1  |   231 |     bool input_echo           = true;
localai-api-1  |       |          ^~~~~~~~~~
localai-api-1  | cd build && cp -rf CMakeFiles/ggml.dir/k_quants.c.o ../ggllm.cpp/k_quants.o
localai-api-1  | ar src libggllm.a ggllm.cpp/libfalcon.o ggllm.cpp/cmpnct_unicode.o ggllm.cpp/ggml.o ggllm.cpp/k_quants.o  ggllm.cpp/falcon_common.o falcon_binding.o
localai-api-1  | make[1]: Leaving directory '/build/go-ggllm'
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-ggllm LIBRARY_PATH=/build/go-ggllm \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/falcon ./cmd/grpc/falcon/
localai-api-1  | cd bloomz && make libbloomz.a
localai-api-1  | make[1]: Entering directory '/build/bloomz'
localai-api-1  | I llama.cpp build info: 
localai-api-1  | I UNAME_S:  Linux
localai-api-1  | I UNAME_P:  unknown
localai-api-1  | I UNAME_M:  x86_64
localai-api-1  | I CFLAGS:   -I.              -O3 -DNDEBUG -std=c11   -fPIC -pthread -msse3
localai-api-1  | I CXXFLAGS: -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread
localai-api-1  | I LDFLAGS:  
localai-api-1  | I CC:       cc (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | I CXX:      g++ (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | 
localai-api-1  | cc  -I.              -O3 -DNDEBUG -std=c11   -fPIC -pthread -msse3   -c ggml.c -o ggml.o
localai-api-1  | g++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread -c utils.cpp -o utils.o
localai-api-1  | g++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -pthread bloomz.cpp ggml.o utils.o -o bloomz.o -c 
localai-api-1  | g++: warning: ggml.o: linker input file unused because linking not done
localai-api-1  | g++: warning: utils.o: linker input file unused because linking not done
localai-api-1  | ar src libbloomz.a bloomz.o ggml.o utils.o
localai-api-1  | make[1]: Leaving directory '/build/bloomz'
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/bloomz LIBRARY_PATH=/build/bloomz \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/bloomz ./cmd/grpc/bloomz/
localai-api-1  | make -C go-llama BUILD_TYPE= libbinding.a
localai-api-1  | make[1]: Entering directory '/build/go-llama'
localai-api-1  | I llama.cpp build info: 
localai-api-1  | I UNAME_S:  Linux
localai-api-1  | I UNAME_P:  unknown
localai-api-1  | I UNAME_M:  x86_64
localai-api-1  | I CFLAGS:   -I./llama.cpp -I. -O3 -DNDEBUG -std=c11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -Wno-unused-function -pthread -march=native -mtune=native
localai-api-1  | I CXXFLAGS: -I./llama.cpp -I. -I./llama.cpp/examples -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -pthread
localai-api-1  | I CGO_LDFLAGS:  
localai-api-1  | I LDFLAGS:  
localai-api-1  | I BUILD_TYPE:  
localai-api-1  | I CMAKE_ARGS:  -DLLAMA_F16C=OFF -DLLAMA_AVX512=OFF -DLLAMA_AVX2=OFF -DLLAMA_AVX=OFF -DLLAMA_FMA=OFF
localai-api-1  | I EXTRA_TARGETS:  
localai-api-1  | I CC:       cc (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | I CXX:      g++ (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | 
localai-api-1  | cd llama.cpp && patch -p1 < ../patches/1902-cuda.patch
localai-api-1  | patching file examples/common.cpp
localai-api-1  | patching file examples/common.h
localai-api-1  | touch prepare
localai-api-1  | mkdir -p build
localai-api-1  | cd build && cmake ../llama.cpp -DLLAMA_F16C=OFF -DLLAMA_AVX512=OFF -DLLAMA_AVX2=OFF -DLLAMA_AVX=OFF -DLLAMA_FMA=OFF && VERBOSE=1 cmake --build . --config Release && cp -rf CMakeFiles/ggml.dir/ggml.c.o ../llama.cpp/ggml.o
localai-api-1  | -- The C compiler identification is GNU 10.2.1
localai-api-1  | -- The CXX compiler identification is GNU 10.2.1
localai-api-1  | -- Detecting C compiler ABI info
localai-api-1  | -- Detecting C compiler ABI info - done
localai-api-1  | -- Check for working C compiler: /usr/bin/cc - skipped
localai-api-1  | -- Detecting C compile features
localai-api-1  | -- Detecting C compile features - done
localai-api-1  | -- Detecting CXX compiler ABI info
localai-api-1  | -- Detecting CXX compiler ABI info - done
localai-api-1  | -- Check for working CXX compiler: /usr/bin/c++ - skipped
localai-api-1  | -- Detecting CXX compile features
localai-api-1  | -- Detecting CXX compile features - done
localai-api-1  | -- Found Git: /usr/bin/git (found version "2.30.2") 
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
localai-api-1  | -- Check if compiler accepts -pthread
localai-api-1  | -- Check if compiler accepts -pthread - yes
localai-api-1  | -- Found Threads: TRUE  
localai-api-1  | -- CMAKE_SYSTEM_PROCESSOR: x86_64
localai-api-1  | -- x86 detected
localai-api-1  | -- Configuring done (0.8s)
localai-api-1  | -- Generating done (0.1s)
localai-api-1  | -- Build files have been written to: /build/go-llama/build
localai-api-1  | Change Dir: '/build/go-llama/build'
localai-api-1  | 
localai-api-1  | Run Build Command(s): /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E env VERBOSE=1 /usr/bin/gmake -f Makefile
localai-api-1  | gmake[2]: Entering directory '/build/go-llama/build'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -S/build/go-llama/llama.cpp -B/build/go-llama/build --check-build-system CMakeFiles/Makefile.cmake 0
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_progress_start /build/go-llama/build/CMakeFiles /build/go-llama/build//CMakeFiles/progress.marks
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/Makefile2 all
localai-api-1  | gmake[3]: Entering directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/BUILD_INFO.dir/build.make CMakeFiles/BUILD_INFO.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp /build/go-llama/build /build/go-llama/build /build/go-llama/build/CMakeFiles/BUILD_INFO.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/BUILD_INFO.dir/build.make CMakeFiles/BUILD_INFO.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | gmake[4]: Nothing to be done for 'CMakeFiles/BUILD_INFO.dir/build'.
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [  2%] Built target BUILD_INFO
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/ggml.dir/build.make CMakeFiles/ggml.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp /build/go-llama/build /build/go-llama/build /build/go-llama/build/CMakeFiles/ggml.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/ggml.dir/build.make CMakeFiles/ggml.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [  4%] Building C object CMakeFiles/ggml.dir/ggml.c.o
localai-api-1  | /usr/bin/cc -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu11 -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -pthread -MD -MT CMakeFiles/ggml.dir/ggml.c.o -MF CMakeFiles/ggml.dir/ggml.c.o.d -o CMakeFiles/ggml.dir/ggml.c.o -c /build/go-llama/llama.cpp/ggml.c
localai-api-1  | /build/go-llama/llama.cpp/ggml.c: In function ‘quantize_row_q8_0’:
localai-api-1  | /build/go-llama/llama.cpp/ggml.c:1150:15: warning: unused variable ‘nb’ [-Wunused-variable]
localai-api-1  |  1150 |     const int nb = k / QK8_0;
localai-api-1  |       |               ^~
localai-api-1  | /build/go-llama/llama.cpp/ggml.c: In function ‘quantize_row_q8_1’:
localai-api-1  | /build/go-llama/llama.cpp/ggml.c:1345:15: warning: unused variable ‘nb’ [-Wunused-variable]
localai-api-1  |  1345 |     const int nb = k / QK8_1;
localai-api-1  |       |               ^~
localai-api-1  | [  6%] Building C object CMakeFiles/ggml.dir/ggml-alloc.c.o
localai-api-1  | /usr/bin/cc -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu11 -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -pthread -MD -MT CMakeFiles/ggml.dir/ggml-alloc.c.o -MF CMakeFiles/ggml.dir/ggml-alloc.c.o.d -o CMakeFiles/ggml.dir/ggml-alloc.c.o -c /build/go-llama/llama.cpp/ggml-alloc.c
localai-api-1  | [  8%] Building C object CMakeFiles/ggml.dir/k_quants.c.o
localai-api-1  | /usr/bin/cc -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu11 -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -Wmissing-prototypes -pthread -MD -MT CMakeFiles/ggml.dir/k_quants.c.o -MF CMakeFiles/ggml.dir/k_quants.c.o.d -o CMakeFiles/ggml.dir/k_quants.c.o -c /build/go-llama/llama.cpp/k_quants.c
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [  8%] Built target ggml
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/ggml_static.dir/build.make CMakeFiles/ggml_static.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp /build/go-llama/build /build/go-llama/build /build/go-llama/build/CMakeFiles/ggml_static.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/ggml_static.dir/build.make CMakeFiles/ggml_static.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 10%] Linking C static library libggml_static.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/ggml_static.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/ggml_static.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libggml_static.a CMakeFiles/ggml.dir/ggml.c.o "CMakeFiles/ggml.dir/ggml-alloc.c.o" CMakeFiles/ggml.dir/k_quants.c.o
localai-api-1  | /usr/bin/ranlib libggml_static.a
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 10%] Built target ggml_static
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/llama.dir/build.make CMakeFiles/llama.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp /build/go-llama/build /build/go-llama/build /build/go-llama/build/CMakeFiles/llama.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f CMakeFiles/llama.dir/build.make CMakeFiles/llama.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 12%] Building CXX object CMakeFiles/llama.dir/llama.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/llama.dir/llama.cpp.o -MF CMakeFiles/llama.dir/llama.cpp.o.d -o CMakeFiles/llama.dir/llama.cpp.o -c /build/go-llama/llama.cpp/llama.cpp
localai-api-1  | [ 14%] Linking CXX static library libllama.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/llama.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llama.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libllama.a CMakeFiles/llama.dir/llama.cpp.o CMakeFiles/ggml.dir/ggml.c.o "CMakeFiles/ggml.dir/ggml-alloc.c.o" CMakeFiles/ggml.dir/k_quants.c.o
localai-api-1  | /usr/bin/ranlib libllama.a
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 14%] Built target llama
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-quantize-fns.dir/build.make tests/CMakeFiles/test-quantize-fns.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/tests /build/go-llama/build /build/go-llama/build/tests /build/go-llama/build/tests/CMakeFiles/test-quantize-fns.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-quantize-fns.dir/build.make tests/CMakeFiles/test-quantize-fns.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 16%] Building CXX object tests/CMakeFiles/test-quantize-fns.dir/test-quantize-fns.cpp.o
localai-api-1  | cd /build/go-llama/build/tests && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT tests/CMakeFiles/test-quantize-fns.dir/test-quantize-fns.cpp.o -MF CMakeFiles/test-quantize-fns.dir/test-quantize-fns.cpp.o.d -o CMakeFiles/test-quantize-fns.dir/test-quantize-fns.cpp.o -c /build/go-llama/llama.cpp/tests/test-quantize-fns.cpp
localai-api-1  | [ 18%] Linking CXX executable ../bin/test-quantize-fns
localai-api-1  | cd /build/go-llama/build/tests && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/test-quantize-fns.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/test-quantize-fns.dir/test-quantize-fns.cpp.o" -o ../bin/test-quantize-fns  ../libllama.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 18%] Built target test-quantize-fns
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-quantize-perf.dir/build.make tests/CMakeFiles/test-quantize-perf.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/tests /build/go-llama/build /build/go-llama/build/tests /build/go-llama/build/tests/CMakeFiles/test-quantize-perf.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-quantize-perf.dir/build.make tests/CMakeFiles/test-quantize-perf.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 20%] Building CXX object tests/CMakeFiles/test-quantize-perf.dir/test-quantize-perf.cpp.o
localai-api-1  | cd /build/go-llama/build/tests && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT tests/CMakeFiles/test-quantize-perf.dir/test-quantize-perf.cpp.o -MF CMakeFiles/test-quantize-perf.dir/test-quantize-perf.cpp.o.d -o CMakeFiles/test-quantize-perf.dir/test-quantize-perf.cpp.o -c /build/go-llama/llama.cpp/tests/test-quantize-perf.cpp
localai-api-1  | [ 22%] Linking CXX executable ../bin/test-quantize-perf
localai-api-1  | cd /build/go-llama/build/tests && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/test-quantize-perf.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/test-quantize-perf.dir/test-quantize-perf.cpp.o" -o ../bin/test-quantize-perf  ../libllama.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 22%] Built target test-quantize-perf
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-sampling.dir/build.make tests/CMakeFiles/test-sampling.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/tests /build/go-llama/build /build/go-llama/build/tests /build/go-llama/build/tests/CMakeFiles/test-sampling.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-sampling.dir/build.make tests/CMakeFiles/test-sampling.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 24%] Building CXX object tests/CMakeFiles/test-sampling.dir/test-sampling.cpp.o
localai-api-1  | cd /build/go-llama/build/tests && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT tests/CMakeFiles/test-sampling.dir/test-sampling.cpp.o -MF CMakeFiles/test-sampling.dir/test-sampling.cpp.o.d -o CMakeFiles/test-sampling.dir/test-sampling.cpp.o -c /build/go-llama/llama.cpp/tests/test-sampling.cpp
localai-api-1  | [ 26%] Linking CXX executable ../bin/test-sampling
localai-api-1  | cd /build/go-llama/build/tests && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/test-sampling.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/test-sampling.dir/test-sampling.cpp.o" -o ../bin/test-sampling  ../libllama.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 26%] Built target test-sampling
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-tokenizer-0.dir/build.make tests/CMakeFiles/test-tokenizer-0.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/tests /build/go-llama/build /build/go-llama/build/tests /build/go-llama/build/tests/CMakeFiles/test-tokenizer-0.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-tokenizer-0.dir/build.make tests/CMakeFiles/test-tokenizer-0.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 28%] Building CXX object tests/CMakeFiles/test-tokenizer-0.dir/test-tokenizer-0.cpp.o
localai-api-1  | cd /build/go-llama/build/tests && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT tests/CMakeFiles/test-tokenizer-0.dir/test-tokenizer-0.cpp.o -MF CMakeFiles/test-tokenizer-0.dir/test-tokenizer-0.cpp.o.d -o CMakeFiles/test-tokenizer-0.dir/test-tokenizer-0.cpp.o -c /build/go-llama/llama.cpp/tests/test-tokenizer-0.cpp
localai-api-1  | /build/go-llama/llama.cpp/tests/test-tokenizer-0.cpp:19:2: warning: extra ‘;’ [-Wpedantic]
localai-api-1  |    19 | };
localai-api-1  |       |  ^
localai-api-1  | [ 30%] Linking CXX executable ../bin/test-tokenizer-0
localai-api-1  | cd /build/go-llama/build/tests && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/test-tokenizer-0.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/test-tokenizer-0.dir/test-tokenizer-0.cpp.o" -o ../bin/test-tokenizer-0  ../libllama.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 30%] Built target test-tokenizer-0
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-grad0.dir/build.make tests/CMakeFiles/test-grad0.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/tests /build/go-llama/build /build/go-llama/build/tests /build/go-llama/build/tests/CMakeFiles/test-grad0.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f tests/CMakeFiles/test-grad0.dir/build.make tests/CMakeFiles/test-grad0.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 32%] Building CXX object tests/CMakeFiles/test-grad0.dir/test-grad0.cpp.o
localai-api-1  | cd /build/go-llama/build/tests && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT tests/CMakeFiles/test-grad0.dir/test-grad0.cpp.o -MF CMakeFiles/test-grad0.dir/test-grad0.cpp.o.d -o CMakeFiles/test-grad0.dir/test-grad0.cpp.o -c /build/go-llama/llama.cpp/tests/test-grad0.cpp
localai-api-1  | [ 34%] Linking CXX executable ../bin/test-grad0
localai-api-1  | cd /build/go-llama/build/tests && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/test-grad0.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/test-grad0.dir/test-grad0.cpp.o" -o ../bin/test-grad0  ../libllama.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 34%] Built target test-grad0
localai-api-1  | /usr/bin/gmake  -f examples/CMakeFiles/common.dir/build.make examples/CMakeFiles/common.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples /build/go-llama/build /build/go-llama/build/examples /build/go-llama/build/examples/CMakeFiles/common.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/CMakeFiles/common.dir/build.make examples/CMakeFiles/common.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 36%] Building CXX object examples/CMakeFiles/common.dir/common.cpp.o
localai-api-1  | cd /build/go-llama/build/examples && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/CMakeFiles/common.dir/common.cpp.o -MF CMakeFiles/common.dir/common.cpp.o.d -o CMakeFiles/common.dir/common.cpp.o -c /build/go-llama/llama.cpp/examples/common.cpp
localai-api-1  | [ 38%] Building CXX object examples/CMakeFiles/common.dir/grammar-parser.cpp.o
localai-api-1  | cd /build/go-llama/build/examples && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/CMakeFiles/common.dir/grammar-parser.cpp.o -MF CMakeFiles/common.dir/grammar-parser.cpp.o.d -o CMakeFiles/common.dir/grammar-parser.cpp.o -c /build/go-llama/llama.cpp/examples/grammar-parser.cpp
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 38%] Built target common
localai-api-1  | /usr/bin/gmake  -f examples/main/CMakeFiles/main.dir/build.make examples/main/CMakeFiles/main.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/main /build/go-llama/build /build/go-llama/build/examples/main /build/go-llama/build/examples/main/CMakeFiles/main.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/main/CMakeFiles/main.dir/build.make examples/main/CMakeFiles/main.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 40%] Building CXX object examples/main/CMakeFiles/main.dir/main.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/main && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/main/CMakeFiles/main.dir/main.cpp.o -MF CMakeFiles/main.dir/main.cpp.o.d -o CMakeFiles/main.dir/main.cpp.o -c /build/go-llama/llama.cpp/examples/main/main.cpp
localai-api-1  | [ 42%] Linking CXX executable ../../bin/main
localai-api-1  | cd /build/go-llama/build/examples/main && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/main.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/main.dir/main.cpp.o ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/main  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 42%] Built target main
localai-api-1  | /usr/bin/gmake  -f examples/quantize/CMakeFiles/quantize.dir/build.make examples/quantize/CMakeFiles/quantize.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/quantize /build/go-llama/build /build/go-llama/build/examples/quantize /build/go-llama/build/examples/quantize/CMakeFiles/quantize.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/quantize/CMakeFiles/quantize.dir/build.make examples/quantize/CMakeFiles/quantize.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 44%] Building CXX object examples/quantize/CMakeFiles/quantize.dir/quantize.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/quantize && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/quantize/CMakeFiles/quantize.dir/quantize.cpp.o -MF CMakeFiles/quantize.dir/quantize.cpp.o.d -o CMakeFiles/quantize.dir/quantize.cpp.o -c /build/go-llama/llama.cpp/examples/quantize/quantize.cpp
localai-api-1  | [ 46%] Linking CXX executable ../../bin/quantize
localai-api-1  | cd /build/go-llama/build/examples/quantize && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/quantize.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/quantize.dir/quantize.cpp.o -o ../../bin/quantize  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 46%] Built target quantize
localai-api-1  | /usr/bin/gmake  -f examples/quantize-stats/CMakeFiles/quantize-stats.dir/build.make examples/quantize-stats/CMakeFiles/quantize-stats.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/quantize-stats /build/go-llama/build /build/go-llama/build/examples/quantize-stats /build/go-llama/build/examples/quantize-stats/CMakeFiles/quantize-stats.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/quantize-stats/CMakeFiles/quantize-stats.dir/build.make examples/quantize-stats/CMakeFiles/quantize-stats.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 48%] Building CXX object examples/quantize-stats/CMakeFiles/quantize-stats.dir/quantize-stats.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/quantize-stats && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/quantize-stats/CMakeFiles/quantize-stats.dir/quantize-stats.cpp.o -MF CMakeFiles/quantize-stats.dir/quantize-stats.cpp.o.d -o CMakeFiles/quantize-stats.dir/quantize-stats.cpp.o -c /build/go-llama/llama.cpp/examples/quantize-stats/quantize-stats.cpp
localai-api-1  | [ 51%] Linking CXX executable ../../bin/quantize-stats
localai-api-1  | cd /build/go-llama/build/examples/quantize-stats && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/quantize-stats.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/quantize-stats.dir/quantize-stats.cpp.o" -o ../../bin/quantize-stats  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 51%] Built target quantize-stats
localai-api-1  | /usr/bin/gmake  -f examples/perplexity/CMakeFiles/perplexity.dir/build.make examples/perplexity/CMakeFiles/perplexity.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/perplexity /build/go-llama/build /build/go-llama/build/examples/perplexity /build/go-llama/build/examples/perplexity/CMakeFiles/perplexity.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/perplexity/CMakeFiles/perplexity.dir/build.make examples/perplexity/CMakeFiles/perplexity.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 53%] Building CXX object examples/perplexity/CMakeFiles/perplexity.dir/perplexity.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/perplexity && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/perplexity/CMakeFiles/perplexity.dir/perplexity.cpp.o -MF CMakeFiles/perplexity.dir/perplexity.cpp.o.d -o CMakeFiles/perplexity.dir/perplexity.cpp.o -c /build/go-llama/llama.cpp/examples/perplexity/perplexity.cpp
localai-api-1  | [ 55%] Linking CXX executable ../../bin/perplexity
localai-api-1  | cd /build/go-llama/build/examples/perplexity && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/perplexity.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/perplexity.dir/perplexity.cpp.o ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/perplexity  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 55%] Built target perplexity
localai-api-1  | /usr/bin/gmake  -f examples/embedding/CMakeFiles/embedding.dir/build.make examples/embedding/CMakeFiles/embedding.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/embedding /build/go-llama/build /build/go-llama/build/examples/embedding /build/go-llama/build/examples/embedding/CMakeFiles/embedding.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/embedding/CMakeFiles/embedding.dir/build.make examples/embedding/CMakeFiles/embedding.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 57%] Building CXX object examples/embedding/CMakeFiles/embedding.dir/embedding.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/embedding && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/embedding/CMakeFiles/embedding.dir/embedding.cpp.o -MF CMakeFiles/embedding.dir/embedding.cpp.o.d -o CMakeFiles/embedding.dir/embedding.cpp.o -c /build/go-llama/llama.cpp/examples/embedding/embedding.cpp
localai-api-1  | [ 59%] Linking CXX executable ../../bin/embedding
localai-api-1  | cd /build/go-llama/build/examples/embedding && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/embedding.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/embedding.dir/embedding.cpp.o ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/embedding  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 59%] Built target embedding
localai-api-1  | /usr/bin/gmake  -f examples/save-load-state/CMakeFiles/save-load-state.dir/build.make examples/save-load-state/CMakeFiles/save-load-state.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/save-load-state /build/go-llama/build /build/go-llama/build/examples/save-load-state /build/go-llama/build/examples/save-load-state/CMakeFiles/save-load-state.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/save-load-state/CMakeFiles/save-load-state.dir/build.make examples/save-load-state/CMakeFiles/save-load-state.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 61%] Building CXX object examples/save-load-state/CMakeFiles/save-load-state.dir/save-load-state.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/save-load-state && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/save-load-state/CMakeFiles/save-load-state.dir/save-load-state.cpp.o -MF CMakeFiles/save-load-state.dir/save-load-state.cpp.o.d -o CMakeFiles/save-load-state.dir/save-load-state.cpp.o -c /build/go-llama/llama.cpp/examples/save-load-state/save-load-state.cpp
localai-api-1  | [ 63%] Linking CXX executable ../../bin/save-load-state
localai-api-1  | cd /build/go-llama/build/examples/save-load-state && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/save-load-state.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/save-load-state.dir/save-load-state.cpp.o" ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/save-load-state  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 63%] Built target save-load-state
localai-api-1  | /usr/bin/gmake  -f examples/benchmark/CMakeFiles/benchmark.dir/build.make examples/benchmark/CMakeFiles/benchmark.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/benchmark /build/go-llama/build /build/go-llama/build/examples/benchmark /build/go-llama/build/examples/benchmark/CMakeFiles/benchmark.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/benchmark/CMakeFiles/benchmark.dir/build.make examples/benchmark/CMakeFiles/benchmark.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 65%] Building CXX object examples/benchmark/CMakeFiles/benchmark.dir/benchmark-matmult.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/benchmark && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/benchmark/CMakeFiles/benchmark.dir/benchmark-matmult.cpp.o -MF CMakeFiles/benchmark.dir/benchmark-matmult.cpp.o.d -o CMakeFiles/benchmark.dir/benchmark-matmult.cpp.o -c /build/go-llama/llama.cpp/examples/benchmark/benchmark-matmult.cpp
localai-api-1  | [ 67%] Linking CXX executable ../../bin/benchmark
localai-api-1  | cd /build/go-llama/build/examples/benchmark && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/benchmark.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/benchmark.dir/benchmark-matmult.cpp.o" ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/benchmark  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 67%] Built target benchmark
localai-api-1  | /usr/bin/gmake  -f examples/baby-llama/CMakeFiles/baby-llama.dir/build.make examples/baby-llama/CMakeFiles/baby-llama.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/baby-llama /build/go-llama/build /build/go-llama/build/examples/baby-llama /build/go-llama/build/examples/baby-llama/CMakeFiles/baby-llama.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/baby-llama/CMakeFiles/baby-llama.dir/build.make examples/baby-llama/CMakeFiles/baby-llama.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 69%] Building CXX object examples/baby-llama/CMakeFiles/baby-llama.dir/baby-llama.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/baby-llama && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/baby-llama/CMakeFiles/baby-llama.dir/baby-llama.cpp.o -MF CMakeFiles/baby-llama.dir/baby-llama.cpp.o.d -o CMakeFiles/baby-llama.dir/baby-llama.cpp.o -c /build/go-llama/llama.cpp/examples/baby-llama/baby-llama.cpp
localai-api-1  | /build/go-llama/llama.cpp/examples/baby-llama/baby-llama.cpp: In function ‘int main(int, char**)’:
localai-api-1  | /build/go-llama/llama.cpp/examples/baby-llama/baby-llama.cpp:1620:32: warning: variable ‘opt_params_adam’ set but not used [-Wunused-but-set-variable]
localai-api-1  |  1620 |         struct ggml_opt_params opt_params_adam = ggml_opt_default_params(GGML_OPT_ADAM);
localai-api-1  |       |                                ^~~~~~~~~~~~~~~
localai-api-1  | [ 71%] Linking CXX executable ../../bin/baby-llama
localai-api-1  | cd /build/go-llama/build/examples/baby-llama && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/baby-llama.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/baby-llama.dir/baby-llama.cpp.o" ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/baby-llama  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 71%] Built target baby-llama
localai-api-1  | /usr/bin/gmake  -f examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/build.make examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/train-text-from-scratch /build/go-llama/build /build/go-llama/build/examples/train-text-from-scratch /build/go-llama/build/examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/build.make examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 73%] Building CXX object examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/train-text-from-scratch.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/train-text-from-scratch && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/train-text-from-scratch/CMakeFiles/train-text-from-scratch.dir/train-text-from-scratch.cpp.o -MF CMakeFiles/train-text-from-scratch.dir/train-text-from-scratch.cpp.o.d -o CMakeFiles/train-text-from-scratch.dir/train-text-from-scratch.cpp.o -c /build/go-llama/llama.cpp/examples/train-text-from-scratch/train-text-from-scratch.cpp
localai-api-1  | [ 75%] Linking CXX executable ../../bin/train-text-from-scratch
localai-api-1  | cd /build/go-llama/build/examples/train-text-from-scratch && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/train-text-from-scratch.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/train-text-from-scratch.dir/train-text-from-scratch.cpp.o" ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/train-text-from-scratch  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 75%] Built target train-text-from-scratch
localai-api-1  | /usr/bin/gmake  -f examples/simple/CMakeFiles/simple.dir/build.make examples/simple/CMakeFiles/simple.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/simple /build/go-llama/build /build/go-llama/build/examples/simple /build/go-llama/build/examples/simple/CMakeFiles/simple.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/simple/CMakeFiles/simple.dir/build.make examples/simple/CMakeFiles/simple.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 77%] Building CXX object examples/simple/CMakeFiles/simple.dir/simple.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/simple && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/simple/CMakeFiles/simple.dir/simple.cpp.o -MF CMakeFiles/simple.dir/simple.cpp.o.d -o CMakeFiles/simple.dir/simple.cpp.o -c /build/go-llama/llama.cpp/examples/simple/simple.cpp
localai-api-1  | [ 79%] Linking CXX executable ../../bin/simple
localai-api-1  | cd /build/go-llama/build/examples/simple && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/simple.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/simple.dir/simple.cpp.o ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/simple  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 79%] Built target simple
localai-api-1  | /usr/bin/gmake  -f examples/embd-input/CMakeFiles/embdinput.dir/build.make examples/embd-input/CMakeFiles/embdinput.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/embd-input /build/go-llama/build /build/go-llama/build/examples/embd-input /build/go-llama/build/examples/embd-input/CMakeFiles/embdinput.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/embd-input/CMakeFiles/embdinput.dir/build.make examples/embd-input/CMakeFiles/embdinput.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 81%] Building CXX object examples/embd-input/CMakeFiles/embdinput.dir/embd-input-lib.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/embd-input && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/embd-input/CMakeFiles/embdinput.dir/embd-input-lib.cpp.o -MF CMakeFiles/embdinput.dir/embd-input-lib.cpp.o.d -o CMakeFiles/embdinput.dir/embd-input-lib.cpp.o -c /build/go-llama/llama.cpp/examples/embd-input/embd-input-lib.cpp
localai-api-1  | [ 83%] Linking CXX static library libembdinput.a
localai-api-1  | cd /build/go-llama/build/examples/embd-input && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/embdinput.dir/cmake_clean_target.cmake
localai-api-1  | cd /build/go-llama/build/examples/embd-input && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/embdinput.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libembdinput.a "CMakeFiles/embdinput.dir/embd-input-lib.cpp.o" ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o"
localai-api-1  | /usr/bin/ranlib libembdinput.a
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 83%] Built target embdinput
localai-api-1  | /usr/bin/gmake  -f examples/embd-input/CMakeFiles/embd-input-test.dir/build.make examples/embd-input/CMakeFiles/embd-input-test.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/embd-input /build/go-llama/build /build/go-llama/build/examples/embd-input /build/go-llama/build/examples/embd-input/CMakeFiles/embd-input-test.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/embd-input/CMakeFiles/embd-input-test.dir/build.make examples/embd-input/CMakeFiles/embd-input-test.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 85%] Building CXX object examples/embd-input/CMakeFiles/embd-input-test.dir/embd-input-test.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/embd-input && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/embd-input/CMakeFiles/embd-input-test.dir/embd-input-test.cpp.o -MF CMakeFiles/embd-input-test.dir/embd-input-test.cpp.o.d -o CMakeFiles/embd-input-test.dir/embd-input-test.cpp.o -c /build/go-llama/llama.cpp/examples/embd-input/embd-input-test.cpp
localai-api-1  | [ 87%] Linking CXX executable ../../bin/embd-input-test
localai-api-1  | cd /build/go-llama/build/examples/embd-input && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/embd-input-test.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG "CMakeFiles/embd-input-test.dir/embd-input-test.cpp.o" ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/embd-input-test  ../../libllama.a libembdinput.a -pthread ../../libllama.a -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 87%] Built target embd-input-test
localai-api-1  | /usr/bin/gmake  -f examples/server/CMakeFiles/server.dir/build.make examples/server/CMakeFiles/server.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/examples/server /build/go-llama/build /build/go-llama/build/examples/server /build/go-llama/build/examples/server/CMakeFiles/server.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f examples/server/CMakeFiles/server.dir/build.make examples/server/CMakeFiles/server.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 89%] Building CXX object examples/server/CMakeFiles/server.dir/server.cpp.o
localai-api-1  | cd /build/go-llama/build/examples/server && /usr/bin/c++ -DGGML_USE_K_QUANTS -DSERVER_VERBOSE=1 -I/build/go-llama/llama.cpp/examples -I/build/go-llama/llama.cpp/examples/server -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT examples/server/CMakeFiles/server.dir/server.cpp.o -MF CMakeFiles/server.dir/server.cpp.o.d -o CMakeFiles/server.dir/server.cpp.o -c /build/go-llama/llama.cpp/examples/server/server.cpp
localai-api-1  | [ 91%] Linking CXX executable ../../bin/server
localai-api-1  | cd /build/go-llama/build/examples/server && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/server.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/server.dir/server.cpp.o ../CMakeFiles/common.dir/common.cpp.o "../CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/server  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 91%] Built target server
localai-api-1  | /usr/bin/gmake  -f pocs/vdot/CMakeFiles/vdot.dir/build.make pocs/vdot/CMakeFiles/vdot.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/pocs/vdot /build/go-llama/build /build/go-llama/build/pocs/vdot /build/go-llama/build/pocs/vdot/CMakeFiles/vdot.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f pocs/vdot/CMakeFiles/vdot.dir/build.make pocs/vdot/CMakeFiles/vdot.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 93%] Building CXX object pocs/vdot/CMakeFiles/vdot.dir/vdot.cpp.o
localai-api-1  | cd /build/go-llama/build/pocs/vdot && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/pocs -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT pocs/vdot/CMakeFiles/vdot.dir/vdot.cpp.o -MF CMakeFiles/vdot.dir/vdot.cpp.o.d -o CMakeFiles/vdot.dir/vdot.cpp.o -c /build/go-llama/llama.cpp/pocs/vdot/vdot.cpp
localai-api-1  | [ 95%] Linking CXX executable ../../bin/vdot
localai-api-1  | cd /build/go-llama/build/pocs/vdot && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/vdot.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/vdot.dir/vdot.cpp.o ../../examples/CMakeFiles/common.dir/common.cpp.o "../../examples/CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/vdot  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [ 95%] Built target vdot
localai-api-1  | /usr/bin/gmake  -f pocs/vdot/CMakeFiles/q8dot.dir/build.make pocs/vdot/CMakeFiles/q8dot.dir/depend
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | cd /build/go-llama/build && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/go-llama/llama.cpp /build/go-llama/llama.cpp/pocs/vdot /build/go-llama/build /build/go-llama/build/pocs/vdot /build/go-llama/build/pocs/vdot/CMakeFiles/q8dot.dir/DependInfo.cmake "--color="
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/bin/gmake  -f pocs/vdot/CMakeFiles/q8dot.dir/build.make pocs/vdot/CMakeFiles/q8dot.dir/build
localai-api-1  | gmake[4]: Entering directory '/build/go-llama/build'
localai-api-1  | [ 97%] Building CXX object pocs/vdot/CMakeFiles/q8dot.dir/q8dot.cpp.o
localai-api-1  | cd /build/go-llama/build/pocs/vdot && /usr/bin/c++ -DGGML_USE_K_QUANTS -I/build/go-llama/llama.cpp/pocs -I/build/go-llama/llama.cpp/examples/. -I/build/go-llama/llama.cpp/. -O3 -DNDEBUG -std=gnu++11 -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT pocs/vdot/CMakeFiles/q8dot.dir/q8dot.cpp.o -MF CMakeFiles/q8dot.dir/q8dot.cpp.o.d -o CMakeFiles/q8dot.dir/q8dot.cpp.o -c /build/go-llama/llama.cpp/pocs/vdot/q8dot.cpp
localai-api-1  | [100%] Linking CXX executable ../../bin/q8dot
localai-api-1  | cd /build/go-llama/build/pocs/vdot && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/q8dot.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -O3 -DNDEBUG CMakeFiles/q8dot.dir/q8dot.cpp.o ../../examples/CMakeFiles/common.dir/common.cpp.o "../../examples/CMakeFiles/common.dir/grammar-parser.cpp.o" -o ../../bin/q8dot  ../../libllama.a -pthread -pthread 
localai-api-1  | gmake[4]: Leaving directory '/build/go-llama/build'
localai-api-1  | [100%] Built target q8dot
localai-api-1  | gmake[3]: Leaving directory '/build/go-llama/build'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_progress_start /build/go-llama/build/CMakeFiles 0
localai-api-1  | gmake[2]: Leaving directory '/build/go-llama/build'
localai-api-1  | 
localai-api-1  | cd build && cp -rf CMakeFiles/llama.dir/llama.cpp.o ../llama.cpp/llama.o
localai-api-1  | cd build && cp -rf examples/CMakeFiles/common.dir/common.cpp.o ../llama.cpp/common.o
localai-api-1  | cd build && cp -rf examples/CMakeFiles/common.dir/grammar-parser.cpp.o ../llama.cpp/grammar-parser.o
localai-api-1  | cd build && cp -rf CMakeFiles/ggml.dir/ggml-alloc.c.o ../llama.cpp/ggml-alloc.o
localai-api-1  | g++ -I./llama.cpp -I. -I./llama.cpp/examples -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -pthread -I./llama.cpp -I./llama.cpp/examples binding.cpp -o binding.o -c 
localai-api-1  | binding.cpp: In function 'int llama_predict(void*, void*, char*, bool)':
localai-api-1  | binding.cpp:533:42: warning: cast from type 'const char*' to type 'char*' casts away qualifiers [-Wcast-qual]
localai-api-1  |   533 |             if (!tokenCallback(state_pr, (char*)token_str)) {
localai-api-1  |       |                                          ^~~~~~~~~~~~~~~~
localai-api-1  | binding.cpp:591:1: warning: label 'end' defined but not used [-Wunused-label]
localai-api-1  |   591 | end:
localai-api-1  |       | ^~~
localai-api-1  | binding.cpp: In function 'void llama_binding_free_model(void*)':
localai-api-1  | binding.cpp:613:5: warning: possible problem detected in invocation of 'operator delete' [-Wdelete-incomplete]
localai-api-1  |   613 |     delete ctx->model;
localai-api-1  |       |     ^~~~~~~~~~~~~~~~~
localai-api-1  | binding.cpp:613:17: warning: invalid use of incomplete type 'struct llama_model'
localai-api-1  |   613 |     delete ctx->model;
localai-api-1  |       |            ~~~~~^~~~~
localai-api-1  | In file included from ./llama.cpp/examples/common.h:5,
localai-api-1  |                  from binding.cpp:1:
localai-api-1  | ./llama.cpp/llama.h:70:12: note: forward declaration of 'struct llama_model'
localai-api-1  |    70 |     struct llama_model;
localai-api-1  |       |            ^~~~~~~~~~~
localai-api-1  | binding.cpp:613:5: note: neither the destructor nor the class-specific 'operator delete' will be called, even if they are declared when the class is defined
localai-api-1  |   613 |     delete ctx->model;
localai-api-1  |       |     ^~~~~~~~~~~~~~~~~
localai-api-1  | cd build && cp -rf CMakeFiles/ggml.dir/k_quants.c.o ../llama.cpp/k_quants.o
localai-api-1  | ar src libbinding.a llama.cpp/ggml.o llama.cpp/k_quants.o  llama.cpp/ggml-alloc.o llama.cpp/common.o llama.cpp/grammar-parser.o llama.cpp/llama.o binding.o
localai-api-1  | make[1]: Leaving directory '/build/go-llama'
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-llama LIBRARY_PATH=/build/go-llama \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/llama ./cmd/grpc/llama/
localai-api-1  | # github.com/go-skynet/go-llama.cpp
localai-api-1  | binding.cpp: In function 'void llama_binding_free_model(void*)':
localai-api-1  | binding.cpp:613:5: warning: possible problem detected in invocation of 'operator delete' [-Wdelete-incomplete]
localai-api-1  |   613 |     delete ctx->model;
localai-api-1  |       |     ^~~~~~~~~~~~~~~~~
localai-api-1  | binding.cpp:613:17: warning: invalid use of incomplete type 'struct llama_model'
localai-api-1  |   613 |     delete ctx->model;
localai-api-1  |       |            ~~~~~^~~~~
localai-api-1  | In file included from go-llama/llama.cpp/examples/common.h:5,
localai-api-1  |                  from binding.cpp:1:
localai-api-1  | go-llama/llama.cpp/llama.h:70:12: note: forward declaration of 'struct llama_model'
localai-api-1  |    70 |     struct llama_model;
localai-api-1  |       |            ^~~~~~~~~~~
localai-api-1  | binding.cpp:613:5: note: neither the destructor nor the class-specific 'operator delete' will be called, even if they are declared when the class is defined
localai-api-1  |   613 |     delete ctx->model;
localai-api-1  |       |     ^~~~~~~~~~~~~~~~~
localai-api-1  | make -C gpt4all/gpt4all-bindings/golang/ libgpt4all.a
localai-api-1  | make[1]: Entering directory '/build/gpt4all/gpt4all-bindings/golang'
localai-api-1  | I go-gpt4all build info: 
localai-api-1  | I UNAME_S:  Linux
localai-api-1  | I UNAME_P:  unknown
localai-api-1  | I UNAME_M:  x86_64
localai-api-1  | I CFLAGS:   -I. -I../../gpt4all-backend/llama.cpp -I../../gpt4all-backend -I -O3 -DNDEBUG -std=c11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -Wno-unused-function -pthread -march=native -mtune=native
localai-api-1  | I CXXFLAGS: -I. -I../../gpt4all-backend/llama.cpp -I../../gpt4all-backend -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native
localai-api-1  | I LDFLAGS:  
localai-api-1  | I CMAKEFLAGS:  
localai-api-1  | I CC:       cc (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | I CXX:      g++ (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | 
localai-api-1  | g++ -I. -I../../gpt4all-backend/llama.cpp -I../../gpt4all-backend -O3 -DNDEBUG -std=c++17 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -march=native -mtune=native binding.cpp -o binding.o -c 
localai-api-1  | binding.cpp: In lambda function:
localai-api-1  | binding.cpp:46:33: warning: unused parameter 'token_id' [-Wunused-parameter]
localai-api-1  |    46 |     auto lambda_prompt = [](int token_id)  {
localai-api-1  |       |                             ~~~~^~~~~~~~
localai-api-1  | binding.cpp: In lambda function:
localai-api-1  | binding.cpp:54:20: warning: cast from type 'const char*' to type 'char*' casts away qualifiers [-Wcast-qual]
localai-api-1  |    54 |         res.append((char*)responsechars);
localai-api-1  |       |                    ^~~~~~~~~~~~~~~~~~~~
localai-api-1  | binding.cpp:55:39: warning: cast from type 'const char*' to type 'char*' casts away qualifiers [-Wcast-qual]
localai-api-1  |    55 |         return !!getTokenCallback(mm, (char*)responsechars);
localai-api-1  |       |                                       ^~~~~~~~~~~~~~~~~~~~
localai-api-1  | binding.cpp:53:35: warning: unused parameter 'token_id' [-Wunused-parameter]
localai-api-1  |    53 |     auto lambda_response = [](int token_id, const char *responsechars) {
localai-api-1  |       |                               ~~~~^~~~~~~~
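Editor's note: the two `-Wcast-qual` warnings above come from the gpt4all binding casting `const char*` to `char*` before appending to a string. The cast is only there to satisfy an older callback signature; the const-correct form needs no cast at all, as this sketch (hypothetical function name) shows:

```cpp
#include <string>

// binding.cpp silences a const-correct API by casting away const:
//   res.append((char*)responsechars);   // -Wcast-qual
// std::string::append already accepts const char*, so keeping the
// qualifier removes the warning without changing behavior:
std::string append_response(const char* responsechars) {
    std::string res;
    res.append(responsechars);
    return res;
}
```

`-Wcast-qual` is benign when the data is only read, as here, but it flags places where a later write through the non-const pointer would be undefined behavior.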
localai-api-1  | binding.cpp: In function 'void model_prompt(const char*, void*, char*, int, float, int, int, int, float, float, int, float)':
localai-api-1  | binding.cpp:64:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    64 |         .logits = NULL,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:65:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    65 |         .logits_size = 0,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:66:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    66 |         .tokens = NULL,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:67:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    67 |         .tokens_size = 0,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:68:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    68 |         .n_past = 0,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:69:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    69 |         .n_ctx = 1024,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:70:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    70 |         .n_predict = 50,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:71:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    71 |         .top_k = 10,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:72:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    72 |         .top_p = 0.9,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:73:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    73 |         .temp = 1.0,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:74:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    74 |         .n_batch = 1,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:75:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    75 |         .repeat_penalty = 1.2,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:76:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    76 |         .repeat_last_n = 10,
localai-api-1  |       |         ^
localai-api-1  | binding.cpp:77:9: warning: C++ designated initializers only available with '-std=c++2a' or '-std=gnu++2a' [-Wpedantic]
localai-api-1  |    77 |         .context_erase = 0.5
localai-api-1  |       |         ^
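Editor's note: the long run of `-Wpedantic` warnings above is g++ pointing out that designated initializers (`.n_ctx = 1024`, `.top_p = 0.9`, ...) are a C++20 feature, while this translation unit is compiled with `-std=c++17` (see the `CXXFLAGS` line earlier in the log). A C++17-safe equivalent using default member initializers, sketched with an illustrative struct rather than the real gpt4all type:

```cpp
// Stand-in for the prompt-context struct initialized in binding.cpp:64-77.
// In-class default member initializers give the same defaults without
// requiring C++20 designated-initializer syntax.
struct prompt_context {
    float* logits = nullptr;
    int n_ctx = 1024;
    int n_predict = 50;
    int top_k = 10;
    float top_p = 0.9f;
    float repeat_penalty = 1.2f;
};

prompt_context make_default_context() {
    prompt_context ctx;   // all defaults applied, valid in C++17
    return ctx;
}
```

These are warnings, not errors: g++ accepts the syntax as an extension, so the build proceeds. Raising the build to `-std=c++20` (or `gnu++2a`, which the CMake-driven targets later in this log already use) would silence them cleanly.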
localai-api-1  | mkdir buildllm
localai-api-1  | cd buildllm && cmake ../../../gpt4all-backend/  && make
localai-api-1  | -- The CXX compiler identification is GNU 10.2.1
localai-api-1  | -- The C compiler identification is GNU 10.2.1
localai-api-1  | -- Detecting CXX compiler ABI info
localai-api-1  | -- Detecting CXX compiler ABI info - done
localai-api-1  | -- Check for working CXX compiler: /usr/bin/c++ - skipped
localai-api-1  | -- Detecting CXX compile features
localai-api-1  | -- Detecting CXX compile features - done
localai-api-1  | -- Detecting C compiler ABI info
localai-api-1  | -- Detecting C compiler ABI info - done
localai-api-1  | -- Check for working C compiler: /usr/bin/cc - skipped
localai-api-1  | -- Detecting C compile features
localai-api-1  | -- Detecting C compile features - done
localai-api-1  | -- Interprocedural optimization support detected
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
localai-api-1  | -- Check if compiler accepts -pthread
localai-api-1  | -- Check if compiler accepts -pthread - yes
localai-api-1  | -- Found Threads: TRUE  
localai-api-1  | -- CMAKE_SYSTEM_PROCESSOR: x86_64
localai-api-1  | -- Configuring ggml implementation target llama-mainline-default in /build/gpt4all/gpt4all-backend/llama.cpp-mainline
localai-api-1  | -- x86 detected
localai-api-1  | -- Configuring ggml implementation target llama-230511-default in /build/gpt4all/gpt4all-backend/llama.cpp-230511
localai-api-1  | -- x86 detected
localai-api-1  | -- Configuring ggml implementation target llama-230519-default in /build/gpt4all/gpt4all-backend/llama.cpp-230519
localai-api-1  | -- x86 detected
localai-api-1  | -- Configuring model implementation target llamamodel-mainline-default
localai-api-1  | -- Configuring model implementation target replit-mainline-default
localai-api-1  | -- Configuring model implementation target llamamodel-230519-default
localai-api-1  | -- Configuring model implementation target llamamodel-230511-default
localai-api-1  | -- Configuring model implementation target gptj-default
localai-api-1  | -- Configuring model implementation target falcon-default
localai-api-1  | -- Configuring model implementation target mpt-default
localai-api-1  | -- Configuring model implementation target bert-default
localai-api-1  | -- Configuring model implementation target starcoder-default
localai-api-1  | -- Configuring ggml implementation target llama-mainline-avxonly in /build/gpt4all/gpt4all-backend/llama.cpp-mainline
localai-api-1  | -- x86 detected
localai-api-1  | -- Configuring ggml implementation target llama-230511-avxonly in /build/gpt4all/gpt4all-backend/llama.cpp-230511
localai-api-1  | -- x86 detected
localai-api-1  | -- Configuring ggml implementation target llama-230519-avxonly in /build/gpt4all/gpt4all-backend/llama.cpp-230519
localai-api-1  | -- x86 detected
localai-api-1  | -- Configuring model implementation target llamamodel-mainline-avxonly
localai-api-1  | -- Configuring model implementation target replit-mainline-avxonly
localai-api-1  | -- Configuring model implementation target llamamodel-230519-avxonly
localai-api-1  | -- Configuring model implementation target llamamodel-230511-avxonly
localai-api-1  | -- Configuring model implementation target gptj-avxonly
localai-api-1  | -- Configuring model implementation target falcon-avxonly
localai-api-1  | -- Configuring model implementation target mpt-avxonly
localai-api-1  | -- Configuring model implementation target bert-avxonly
localai-api-1  | -- Configuring model implementation target starcoder-avxonly
localai-api-1  | -- Configuring done (1.4s)
localai-api-1  | -- Generating done (0.1s)
localai-api-1  | -- Build files have been written to: /build/gpt4all/gpt4all-bindings/golang/buildllm
localai-api-1  | make[2]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -S/build/gpt4all/gpt4all-backend -B/build/gpt4all/gpt4all-bindings/golang/buildllm --check-build-system CMakeFiles/Makefile.cmake 0
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_progress_start /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles /build/gpt4all/gpt4all-bindings/golang/buildllm//CMakeFiles/progress.marks
localai-api-1  | make  -f CMakeFiles/Makefile2 all
localai-api-1  | make[3]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/ggml-mainline-default.dir/build.make CMakeFiles/ggml-mainline-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/ggml-mainline-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/ggml-mainline-default.dir/build.make CMakeFiles/ggml-mainline-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [  1%] Building C object CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml.c.o
localai-api-1  | /usr/bin/cc -DGGML_USE_K_QUANTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -mf16c -mfma -mavx2 -pthread -MD -MT CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml.c.o -MF CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml.c.o.d -o CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml.c.o -c /build/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.c
localai-api-1  | [  2%] Building C object CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/k_quants.c.o
localai-api-1  | /usr/bin/cc -DGGML_USE_K_QUANTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -mf16c -mfma -mavx2 -pthread -MD -MT CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/k_quants.c.o -MF CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/k_quants.c.o.d -o CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/k_quants.c.o -c /build/gpt4all/gpt4all-backend/llama.cpp-mainline/k_quants.c
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [  2%] Built target ggml-mainline-default
localai-api-1  | make  -f CMakeFiles/llama-mainline-default.dir/build.make CMakeFiles/llama-mainline-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llama-mainline-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llama-mainline-default.dir/build.make CMakeFiles/llama-mainline-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [  3%] Building CXX object CMakeFiles/llama-mainline-default.dir/llama.cpp-mainline/llama.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_USE_K_QUANTS -DLLAMA_BUILD -DLLAMA_SHARED -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/llama-mainline-default.dir/llama.cpp-mainline/llama.cpp.o -MF CMakeFiles/llama-mainline-default.dir/llama.cpp-mainline/llama.cpp.o.d -o CMakeFiles/llama-mainline-default.dir/llama.cpp-mainline/llama.cpp.o -c /build/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp
localai-api-1  | [  4%] Linking CXX static library libllama-mainline-default.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/llama-mainline-default.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llama-mainline-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libllama-mainline-default.a "CMakeFiles/llama-mainline-default.dir/llama.cpp-mainline/llama.cpp.o" "CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml.c.o" "CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/k_quants.c.o"
localai-api-1  | /usr/bin/ranlib libllama-mainline-default.a
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [  4%] Built target llama-mainline-default
localai-api-1  | make  -f CMakeFiles/ggml-230511-default.dir/build.make CMakeFiles/ggml-230511-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/ggml-230511-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/ggml-230511-default.dir/build.make CMakeFiles/ggml-230511-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [  5%] Building C object CMakeFiles/ggml-230511-default.dir/llama.cpp-230511/ggml.c.o
localai-api-1  | /usr/bin/cc  -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -mf16c -mfma -mavx2 -pthread -MD -MT CMakeFiles/ggml-230511-default.dir/llama.cpp-230511/ggml.c.o -MF CMakeFiles/ggml-230511-default.dir/llama.cpp-230511/ggml.c.o.d -o CMakeFiles/ggml-230511-default.dir/llama.cpp-230511/ggml.c.o -c /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c: In function 'ggml_compute_forward_alibi_f32':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c:9357:15: warning: unused variable 'ne2_ne3' [-Wunused-variable]
localai-api-1  |  9357 |     const int ne2_ne3 = n/ne1; // ne2*ne3
localai-api-1  |       |               ^~~~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c: In function 'ggml_compute_forward_alibi_f16':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c:9419:15: warning: unused variable 'ne2' [-Wunused-variable]
localai-api-1  |  9419 |     const int ne2 = src0->ne[2]; // n_head -> this is k
localai-api-1  |       |               ^~~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c: In function 'ggml_compute_forward_alibi':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c:9468:5: warning: enumeration value 'GGML_TYPE_Q4_3' not handled in switch [-Wswitch]
localai-api-1  |  9468 |     switch (src0->type) {
localai-api-1  |       |     ^~~~~~
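Editor's note: the `-Wswitch` warning above fires because `ggml_compute_forward_alibi` switches over the tensor-type enum without handling `GGML_TYPE_Q4_3` and without a `default:` label. The mechanism in miniature (demo enum, not the real ggml one):

```cpp
// -Wswitch warns when a switch over an enum omits an enumerator and has
// no default label. Covering every enumerator (or adding default:)
// silences it.
enum class ggml_type_demo { Q4_0, Q4_1, Q4_3, F32 };

int type_bits(ggml_type_demo t) {
    switch (t) {
        case ggml_type_demo::Q4_0: return 4;
        case ggml_type_demo::Q4_1: return 4;
        case ggml_type_demo::Q4_3: return 4;  // the case -Wswitch wants
        case ggml_type_demo::F32:  return 32;
    }
    return 0;  // unreachable once all cases are covered
}
```

Leaving the switch exhaustive (no `default:`) is often deliberate: the next time an enumerator is added, `-Wswitch` will point at every switch that needs updating.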
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [  5%] Built target ggml-230511-default
localai-api-1  | make  -f CMakeFiles/llama-230511-default.dir/build.make CMakeFiles/llama-230511-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llama-230511-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llama-230511-default.dir/build.make CMakeFiles/llama-230511-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [  6%] Building CXX object CMakeFiles/llama-230511-default.dir/llama.cpp-230511/llama.cpp.o
localai-api-1  | /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/llama-230511-default.dir/llama.cpp-230511/llama.cpp.o -MF CMakeFiles/llama-230511-default.dir/llama.cpp-230511/llama.cpp.o.d -o CMakeFiles/llama-230511-default.dir/llama.cpp-230511/llama.cpp.o -c /build/gpt4all/gpt4all-backend/llama.cpp-230511/llama.cpp
localai-api-1  | [  7%] Linking CXX static library libllama-230511-default.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/llama-230511-default.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llama-230511-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libllama-230511-default.a "CMakeFiles/llama-230511-default.dir/llama.cpp-230511/llama.cpp.o" "CMakeFiles/ggml-230511-default.dir/llama.cpp-230511/ggml.c.o"
localai-api-1  | /usr/bin/ranlib libllama-230511-default.a
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [  7%] Built target llama-230511-default
localai-api-1  | make  -f CMakeFiles/ggml-230519-default.dir/build.make CMakeFiles/ggml-230519-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/ggml-230519-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/ggml-230519-default.dir/build.make CMakeFiles/ggml-230519-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [  8%] Building C object CMakeFiles/ggml-230519-default.dir/llama.cpp-230519/ggml.c.o
localai-api-1  | /usr/bin/cc  -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230519 -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -mf16c -mfma -mavx2 -pthread -MD -MT CMakeFiles/ggml-230519-default.dir/llama.cpp-230519/ggml.c.o -MF CMakeFiles/ggml-230519-default.dir/llama.cpp-230519/ggml.c.o.d -o CMakeFiles/ggml-230519-default.dir/llama.cpp-230519/ggml.c.o -c /build/gpt4all/gpt4all-backend/llama.cpp-230519/ggml.c
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [  8%] Built target ggml-230519-default
localai-api-1  | make  -f CMakeFiles/llama-230519-default.dir/build.make CMakeFiles/llama-230519-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llama-230519-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llama-230519-default.dir/build.make CMakeFiles/llama-230519-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 10%] Building CXX object CMakeFiles/llama-230519-default.dir/llama.cpp-230519/llama.cpp.o
localai-api-1  | /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230519 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/llama-230519-default.dir/llama.cpp-230519/llama.cpp.o -MF CMakeFiles/llama-230519-default.dir/llama.cpp-230519/llama.cpp.o.d -o CMakeFiles/llama-230519-default.dir/llama.cpp-230519/llama.cpp.o -c /build/gpt4all/gpt4all-backend/llama.cpp-230519/llama.cpp
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230519/llama.cpp: In function 'size_t llama_set_state_data(llama_context*, const uint8_t*)':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230519/llama.cpp:2685:27: warning: cast from type 'const uint8_t*' {aka 'const unsigned char*'} to type 'void*' casts away qualifiers [-Wcast-qual]
localai-api-1  |  2685 |             kin3d->data = (void *) inp;
localai-api-1  |       |                           ^~~~~~~~~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230519/llama.cpp:2689:27: warning: cast from type 'const uint8_t*' {aka 'const unsigned char*'} to type 'void*' casts away qualifiers [-Wcast-qual]
localai-api-1  |  2689 |             vin3d->data = (void *) inp;
localai-api-1  |       |                           ^~~~~~~~~~~~
localai-api-1  | [ 11%] Linking CXX static library libllama-230519-default.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/llama-230519-default.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llama-230519-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libllama-230519-default.a "CMakeFiles/llama-230519-default.dir/llama.cpp-230519/llama.cpp.o" "CMakeFiles/ggml-230519-default.dir/llama.cpp-230519/ggml.c.o"
localai-api-1  | /usr/bin/ranlib libllama-230519-default.a
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 11%] Built target llama-230519-default
localai-api-1  | make  -f CMakeFiles/llamamodel-mainline-default.dir/build.make CMakeFiles/llamamodel-mainline-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llamamodel-mainline-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llamamodel-mainline-default.dir/build.make CMakeFiles/llamamodel-mainline-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 12%] Building CXX object CMakeFiles/llamamodel-mainline-default.dir/llamamodel.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -DLLAMA_DATE=999999 -DLLAMA_VERSIONS=">=3" -Dllamamodel_mainline_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-mainline-default.dir/llamamodel.cpp.o -MF CMakeFiles/llamamodel-mainline-default.dir/llamamodel.cpp.o.d -o CMakeFiles/llamamodel-mainline-default.dir/llamamodel.cpp.o -c /build/gpt4all/gpt4all-backend/llamamodel.cpp
localai-api-1  | /build/gpt4all/gpt4all-backend/llamamodel.cpp: In member function 'virtual bool LLamaModel::loadModel(const string&)':
localai-api-1  | /build/gpt4all/gpt4all-backend/llamamodel.cpp:159:71: warning: 'llama_context* llama_init_from_file(const char*, llama_context_params)' is deprecated: please use llama_load_model_from_file combined with llama_new_context_with_model instead [-Wdeprecated-declarations]
localai-api-1  |   159 |     d_ptr->ctx = llama_init_from_file(modelPath.c_str(), d_ptr->params);
localai-api-1  |       |                                                                       ^
localai-api-1  | In file included from /build/gpt4all/gpt4all-backend/llamamodel.cpp:28:
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.h:161:49: note: declared here
localai-api-1  |   161 |     LLAMA_API DEPRECATED(struct llama_context * llama_init_from_file(
localai-api-1  |       |                                                 ^~~~~~~~~~~~~~~~~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.h:30:36: note: in definition of macro 'DEPRECATED'
localai-api-1  |    30 | #    define DEPRECATED(func, hint) func __attribute__((deprecated(hint)))
localai-api-1  |       |                                    ^~~~
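Editor's note: the `-Wdeprecated-declarations` warning above is the `DEPRECATED` macro in `llama.h` at work — it wraps a declaration in `__attribute__((deprecated(hint)))`, so any caller of `llama_init_from_file` gets the migration hint at compile time. The same mechanism in a few lines, with hypothetical function names:

```cpp
// Equivalent of llama.h's DEPRECATED(func, hint) macro: callers of
// old_api() get a -Wdeprecated-declarations warning with the hint text.
[[deprecated("use new_api() instead")]]
int old_api();

int new_api() { return 2; }

// Calling only the replacement keeps the build warning-free:
int call_replacement() { return new_api(); }
```

The warning in the log is therefore upstream gpt4all lagging behind llama.cpp's API split (`llama_load_model_from_file` + `llama_new_context_with_model`); it does not affect the build result.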
localai-api-1  | [ 13%] Building CXX object CMakeFiles/llamamodel-mainline-default.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -DLLAMA_DATE=999999 -DLLAMA_VERSIONS=">=3" -Dllamamodel_mainline_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-mainline-default.dir/llmodel_shared.cpp.o -MF CMakeFiles/llamamodel-mainline-default.dir/llmodel_shared.cpp.o.d -o CMakeFiles/llamamodel-mainline-default.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 14%] Linking CXX shared library libllamamodel-mainline-default.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llamamodel-mainline-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libllamamodel-mainline-default.so -o libllamamodel-mainline-default.so "CMakeFiles/llamamodel-mainline-default.dir/llamamodel.cpp.o" "CMakeFiles/llamamodel-mainline-default.dir/llmodel_shared.cpp.o"  libllama-mainline-default.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 14%] Built target llamamodel-mainline-default
localai-api-1  | make  -f CMakeFiles/replit-mainline-default.dir/build.make CMakeFiles/replit-mainline-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/replit-mainline-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/replit-mainline-default.dir/build.make CMakeFiles/replit-mainline-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 15%] Building CXX object CMakeFiles/replit-mainline-default.dir/replit.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dreplit_mainline_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/replit-mainline-default.dir/replit.cpp.o -MF CMakeFiles/replit-mainline-default.dir/replit.cpp.o.d -o CMakeFiles/replit-mainline-default.dir/replit.cpp.o -c /build/gpt4all/gpt4all-backend/replit.cpp
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp: In function 'bool replit_eval(const replit_model&, int, int, const std::vector<int>&, std::vector<float>&, size_t&)':
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::n_nodes' [-Wmissing-field-initializers]
localai-api-1  |   555 |     struct ggml_cgraph gf = {.n_threads = n_threads};
localai-api-1  |       |                                                    ^
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::n_leafs' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::work_size' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::work' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::nodes' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::grads' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::leafs' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::perf_runs' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::perf_cycles' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::perf_time_us' [-Wmissing-field-initializers]
localai-api-1  | [ 16%] Building CXX object CMakeFiles/replit-mainline-default.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dreplit_mainline_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/replit-mainline-default.dir/utils.cpp.o -MF CMakeFiles/replit-mainline-default.dir/utils.cpp.o.d -o CMakeFiles/replit-mainline-default.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 17%] Building CXX object CMakeFiles/replit-mainline-default.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dreplit_mainline_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/replit-mainline-default.dir/llmodel_shared.cpp.o -MF CMakeFiles/replit-mainline-default.dir/llmodel_shared.cpp.o.d -o CMakeFiles/replit-mainline-default.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 18%] Linking CXX shared library libreplit-mainline-default.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/replit-mainline-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libreplit-mainline-default.so -o libreplit-mainline-default.so "CMakeFiles/replit-mainline-default.dir/replit.cpp.o" "CMakeFiles/replit-mainline-default.dir/utils.cpp.o" "CMakeFiles/replit-mainline-default.dir/llmodel_shared.cpp.o"  libllama-mainline-default.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 18%] Built target replit-mainline-default
localai-api-1  | make  -f CMakeFiles/llamamodel-230519-default.dir/build.make CMakeFiles/llamamodel-230519-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llamamodel-230519-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llamamodel-230519-default.dir/build.make CMakeFiles/llamamodel-230519-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 20%] Building CXX object CMakeFiles/llamamodel-230519-default.dir/llamamodel.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -DLLAMA_DATE=230519 -DLLAMA_VERSIONS===2 -Dllamamodel_230519_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230519 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-230519-default.dir/llamamodel.cpp.o -MF CMakeFiles/llamamodel-230519-default.dir/llamamodel.cpp.o.d -o CMakeFiles/llamamodel-230519-default.dir/llamamodel.cpp.o -c /build/gpt4all/gpt4all-backend/llamamodel.cpp
localai-api-1  | [ 21%] Building CXX object CMakeFiles/llamamodel-230519-default.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -DLLAMA_DATE=230519 -DLLAMA_VERSIONS===2 -Dllamamodel_230519_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230519 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-230519-default.dir/llmodel_shared.cpp.o -MF CMakeFiles/llamamodel-230519-default.dir/llmodel_shared.cpp.o.d -o CMakeFiles/llamamodel-230519-default.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 22%] Linking CXX shared library libllamamodel-230519-default.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llamamodel-230519-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libllamamodel-230519-default.so -o libllamamodel-230519-default.so "CMakeFiles/llamamodel-230519-default.dir/llamamodel.cpp.o" "CMakeFiles/llamamodel-230519-default.dir/llmodel_shared.cpp.o"  libllama-230519-default.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 22%] Built target llamamodel-230519-default
localai-api-1  | make  -f CMakeFiles/llamamodel-230511-default.dir/build.make CMakeFiles/llamamodel-230511-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llamamodel-230511-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llamamodel-230511-default.dir/build.make CMakeFiles/llamamodel-230511-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 23%] Building CXX object CMakeFiles/llamamodel-230511-default.dir/llamamodel.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -DLLAMA_DATE=230511 -DLLAMA_VERSIONS="<=1" -Dllamamodel_230511_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-230511-default.dir/llamamodel.cpp.o -MF CMakeFiles/llamamodel-230511-default.dir/llamamodel.cpp.o.d -o CMakeFiles/llamamodel-230511-default.dir/llamamodel.cpp.o -c /build/gpt4all/gpt4all-backend/llamamodel.cpp
localai-api-1  | [ 24%] Building CXX object CMakeFiles/llamamodel-230511-default.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -DLLAMA_DATE=230511 -DLLAMA_VERSIONS="<=1" -Dllamamodel_230511_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-230511-default.dir/llmodel_shared.cpp.o -MF CMakeFiles/llamamodel-230511-default.dir/llmodel_shared.cpp.o.d -o CMakeFiles/llamamodel-230511-default.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 25%] Linking CXX shared library libllamamodel-230511-default.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llamamodel-230511-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libllamamodel-230511-default.so -o libllamamodel-230511-default.so "CMakeFiles/llamamodel-230511-default.dir/llamamodel.cpp.o" "CMakeFiles/llamamodel-230511-default.dir/llmodel_shared.cpp.o"  libllama-230511-default.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 25%] Built target llamamodel-230511-default
localai-api-1  | make  -f CMakeFiles/gptj-default.dir/build.make CMakeFiles/gptj-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/gptj-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/gptj-default.dir/build.make CMakeFiles/gptj-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 26%] Building CXX object CMakeFiles/gptj-default.dir/gptj.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dgptj_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/gptj-default.dir/gptj.cpp.o -MF CMakeFiles/gptj-default.dir/gptj.cpp.o.d -o CMakeFiles/gptj-default.dir/gptj.cpp.o -c /build/gpt4all/gpt4all-backend/gptj.cpp
localai-api-1  | [ 27%] Building CXX object CMakeFiles/gptj-default.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dgptj_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/gptj-default.dir/utils.cpp.o -MF CMakeFiles/gptj-default.dir/utils.cpp.o.d -o CMakeFiles/gptj-default.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 28%] Building CXX object CMakeFiles/gptj-default.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dgptj_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/gptj-default.dir/llmodel_shared.cpp.o -MF CMakeFiles/gptj-default.dir/llmodel_shared.cpp.o.d -o CMakeFiles/gptj-default.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 30%] Linking CXX shared library libgptj-default.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/gptj-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libgptj-default.so -o libgptj-default.so "CMakeFiles/gptj-default.dir/gptj.cpp.o" "CMakeFiles/gptj-default.dir/utils.cpp.o" "CMakeFiles/gptj-default.dir/llmodel_shared.cpp.o" "CMakeFiles/ggml-230511-default.dir/llama.cpp-230511/ggml.c.o"  -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 30%] Built target gptj-default
localai-api-1  | make  -f CMakeFiles/falcon-default.dir/build.make CMakeFiles/falcon-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/falcon-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/falcon-default.dir/build.make CMakeFiles/falcon-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 31%] Building CXX object CMakeFiles/falcon-default.dir/falcon.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dfalcon_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/falcon-default.dir/falcon.cpp.o -MF CMakeFiles/falcon-default.dir/falcon.cpp.o.d -o CMakeFiles/falcon-default.dir/falcon.cpp.o -c /build/gpt4all/gpt4all-backend/falcon.cpp
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp: In function 'bool falcon_model_load(const string&, falcon_model&, gpt_vocab&, size_t*)':
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp:199:19: warning: unused variable 'n_ctx' [-Wunused-variable]
localai-api-1  |   199 |         const int n_ctx = hparams.n_ctx;
localai-api-1  |       |                   ^~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp:340:19: warning: unused variable 'n_head_kv' [-Wunused-variable]
localai-api-1  |   340 |         const int n_head_kv = hparams.n_head_kv;
localai-api-1  |       |                   ^~~~~~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp:344:23: warning: unused variable 'n_elements' [-Wunused-variable]
localai-api-1  |   344 |         const int64_t n_elements = head_dim*n_mem;
localai-api-1  |       |                       ^~~~~~~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp: In function 'bool falcon_eval(const falcon_model&, int, int, const std::vector<int>&, std::vector<float>&, size_t&)':
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp:465:15: warning: unused variable 'version' [-Wunused-variable]
localai-api-1  |   465 |     const int version = hparams.falcon_version;
localai-api-1  |       |               ^~~~~~~
localai-api-1  | [ 32%] Building CXX object CMakeFiles/falcon-default.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dfalcon_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/falcon-default.dir/utils.cpp.o -MF CMakeFiles/falcon-default.dir/utils.cpp.o.d -o CMakeFiles/falcon-default.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 33%] Building CXX object CMakeFiles/falcon-default.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dfalcon_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/falcon-default.dir/llmodel_shared.cpp.o -MF CMakeFiles/falcon-default.dir/llmodel_shared.cpp.o.d -o CMakeFiles/falcon-default.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 34%] Linking CXX shared library libfalcon-default.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/falcon-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libfalcon-default.so -o libfalcon-default.so "CMakeFiles/falcon-default.dir/falcon.cpp.o" "CMakeFiles/falcon-default.dir/utils.cpp.o" "CMakeFiles/falcon-default.dir/llmodel_shared.cpp.o"  libllama-mainline-default.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 34%] Built target falcon-default
localai-api-1  | make  -f CMakeFiles/mpt-default.dir/build.make CMakeFiles/mpt-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/mpt-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/mpt-default.dir/build.make CMakeFiles/mpt-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 35%] Building CXX object CMakeFiles/mpt-default.dir/mpt.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dmpt_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/mpt-default.dir/mpt.cpp.o -MF CMakeFiles/mpt-default.dir/mpt.cpp.o.d -o CMakeFiles/mpt-default.dir/mpt.cpp.o -c /build/gpt4all/gpt4all-backend/mpt.cpp
localai-api-1  | [ 36%] Building CXX object CMakeFiles/mpt-default.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dmpt_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/mpt-default.dir/utils.cpp.o -MF CMakeFiles/mpt-default.dir/utils.cpp.o.d -o CMakeFiles/mpt-default.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 37%] Building CXX object CMakeFiles/mpt-default.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dmpt_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/mpt-default.dir/llmodel_shared.cpp.o -MF CMakeFiles/mpt-default.dir/llmodel_shared.cpp.o.d -o CMakeFiles/mpt-default.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 38%] Linking CXX shared library libmpt-default.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/mpt-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libmpt-default.so -o libmpt-default.so "CMakeFiles/mpt-default.dir/mpt.cpp.o" "CMakeFiles/mpt-default.dir/utils.cpp.o" "CMakeFiles/mpt-default.dir/llmodel_shared.cpp.o" "CMakeFiles/ggml-230511-default.dir/llama.cpp-230511/ggml.c.o"  -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 38%] Built target mpt-default
localai-api-1  | make  -f CMakeFiles/bert-default.dir/build.make CMakeFiles/bert-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/bert-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/bert-default.dir/build.make CMakeFiles/bert-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 40%] Building CXX object CMakeFiles/bert-default.dir/bert.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dbert_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/bert-default.dir/bert.cpp.o -MF CMakeFiles/bert-default.dir/bert.cpp.o.d -o CMakeFiles/bert-default.dir/bert.cpp.o -c /build/gpt4all/gpt4all-backend/bert.cpp
localai-api-1  | [ 41%] Building CXX object CMakeFiles/bert-default.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dbert_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/bert-default.dir/utils.cpp.o -MF CMakeFiles/bert-default.dir/utils.cpp.o.d -o CMakeFiles/bert-default.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 42%] Building CXX object CMakeFiles/bert-default.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dbert_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/bert-default.dir/llmodel_shared.cpp.o -MF CMakeFiles/bert-default.dir/llmodel_shared.cpp.o.d -o CMakeFiles/bert-default.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 43%] Linking CXX shared library libbert-default.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/bert-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libbert-default.so -o libbert-default.so "CMakeFiles/bert-default.dir/bert.cpp.o" "CMakeFiles/bert-default.dir/utils.cpp.o" "CMakeFiles/bert-default.dir/llmodel_shared.cpp.o"  libllama-mainline-default.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 43%] Built target bert-default
localai-api-1  | make  -f CMakeFiles/starcoder-default.dir/build.make CMakeFiles/starcoder-default.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/starcoder-default.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/starcoder-default.dir/build.make CMakeFiles/starcoder-default.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 44%] Building CXX object CMakeFiles/starcoder-default.dir/starcoder.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dstarcoder_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/starcoder-default.dir/starcoder.cpp.o -MF CMakeFiles/starcoder-default.dir/starcoder.cpp.o.d -o CMakeFiles/starcoder-default.dir/starcoder.cpp.o -c /build/gpt4all/gpt4all-backend/starcoder.cpp
localai-api-1  | /build/gpt4all/gpt4all-backend/starcoder.cpp: In function 'bool starcoder_eval(const starcoder_model&, int, int, const std::vector<int>&, std::vector<float>&, size_t&)':
localai-api-1  | /build/gpt4all/gpt4all-backend/starcoder.cpp:470:18: warning: unused variable 'head_dim' [-Wunused-variable]
localai-api-1  |   470 |     const size_t head_dim = n_embd / n_head;
localai-api-1  |       |                  ^~~~~~~~
localai-api-1  | [ 45%] Building CXX object CMakeFiles/starcoder-default.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dstarcoder_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/starcoder-default.dir/utils.cpp.o -MF CMakeFiles/starcoder-default.dir/utils.cpp.o.d -o CMakeFiles/starcoder-default.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 46%] Building CXX object CMakeFiles/starcoder-default.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"default\" -Dstarcoder_default_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/starcoder-default.dir/llmodel_shared.cpp.o -MF CMakeFiles/starcoder-default.dir/llmodel_shared.cpp.o.d -o CMakeFiles/starcoder-default.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 47%] Linking CXX shared library libstarcoder-default.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/starcoder-default.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libstarcoder-default.so -o libstarcoder-default.so "CMakeFiles/starcoder-default.dir/starcoder.cpp.o" "CMakeFiles/starcoder-default.dir/utils.cpp.o" "CMakeFiles/starcoder-default.dir/llmodel_shared.cpp.o"  libllama-mainline-default.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 47%] Built target starcoder-default
localai-api-1  | make  -f CMakeFiles/ggml-mainline-avxonly.dir/build.make CMakeFiles/ggml-mainline-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/ggml-mainline-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/ggml-mainline-avxonly.dir/build.make CMakeFiles/ggml-mainline-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 48%] Building C object CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml.c.o
localai-api-1  | /usr/bin/cc -DGGML_USE_K_QUANTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -MD -MT CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml.c.o -MF CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml.c.o.d -o CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml.c.o -c /build/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.c
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.c: In function 'quantize_row_q8_0':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.c:1096:15: warning: unused variable 'nb' [-Wunused-variable]
localai-api-1  |  1096 |     const int nb = k / QK8_0;
localai-api-1  |       |               ^~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.c: In function 'quantize_row_q8_1':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.c:1291:15: warning: unused variable 'nb' [-Wunused-variable]
localai-api-1  |  1291 |     const int nb = k / QK8_1;
localai-api-1  |       |               ^~
localai-api-1  | [ 50%] Building C object CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/k_quants.c.o
localai-api-1  | /usr/bin/cc -DGGML_USE_K_QUANTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -MD -MT CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/k_quants.c.o -MF CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/k_quants.c.o.d -o CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/k_quants.c.o -c /build/gpt4all/gpt4all-backend/llama.cpp-mainline/k_quants.c
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 50%] Built target ggml-mainline-avxonly
localai-api-1  | make  -f CMakeFiles/llama-mainline-avxonly.dir/build.make CMakeFiles/llama-mainline-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llama-mainline-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llama-mainline-avxonly.dir/build.make CMakeFiles/llama-mainline-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 51%] Building CXX object CMakeFiles/llama-mainline-avxonly.dir/llama.cpp-mainline/llama.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_USE_K_QUANTS -DLLAMA_BUILD -DLLAMA_SHARED -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/llama-mainline-avxonly.dir/llama.cpp-mainline/llama.cpp.o -MF CMakeFiles/llama-mainline-avxonly.dir/llama.cpp-mainline/llama.cpp.o.d -o CMakeFiles/llama-mainline-avxonly.dir/llama.cpp-mainline/llama.cpp.o -c /build/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp
localai-api-1  | [ 52%] Linking CXX static library libllama-mainline-avxonly.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/llama-mainline-avxonly.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llama-mainline-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libllama-mainline-avxonly.a "CMakeFiles/llama-mainline-avxonly.dir/llama.cpp-mainline/llama.cpp.o" "CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml.c.o" "CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/k_quants.c.o"
localai-api-1  | /usr/bin/ranlib libllama-mainline-avxonly.a
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 52%] Built target llama-mainline-avxonly
localai-api-1  | make  -f CMakeFiles/ggml-230511-avxonly.dir/build.make CMakeFiles/ggml-230511-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/ggml-230511-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/ggml-230511-avxonly.dir/build.make CMakeFiles/ggml-230511-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 53%] Building C object CMakeFiles/ggml-230511-avxonly.dir/llama.cpp-230511/ggml.c.o
localai-api-1  | /usr/bin/cc  -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -MD -MT CMakeFiles/ggml-230511-avxonly.dir/llama.cpp-230511/ggml.c.o -MF CMakeFiles/ggml-230511-avxonly.dir/llama.cpp-230511/ggml.c.o.d -o CMakeFiles/ggml-230511-avxonly.dir/llama.cpp-230511/ggml.c.o -c /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c: In function 'quantize_row_q4_0':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c:781:15: warning: unused variable 'nb' [-Wunused-variable]
localai-api-1  |   781 |     const int nb = k / QK4_0;
localai-api-1  |       |               ^~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c: In function 'quantize_row_q4_1':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c:1129:27: warning: unused variable 'y' [-Wunused-variable]
localai-api-1  |  1129 |     block_q4_1 * restrict y = vy;
localai-api-1  |       |                           ^
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c:1127:15: warning: unused variable 'nb' [-Wunused-variable]
localai-api-1  |  1127 |     const int nb = k / QK4_1;
localai-api-1  |       |               ^~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c: In function 'quantize_row_q8_1':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c:1507:15: warning: unused variable 'nb' [-Wunused-variable]
localai-api-1  |  1507 |     const int nb = k / QK8_1;
localai-api-1  |       |               ^~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c: In function 'ggml_compute_forward_alibi_f32':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c:9357:15: warning: unused variable 'ne2_ne3' [-Wunused-variable]
localai-api-1  |  9357 |     const int ne2_ne3 = n/ne1; // ne2*ne3
localai-api-1  |       |               ^~~~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c: In function 'ggml_compute_forward_alibi_f16':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c:9419:15: warning: unused variable 'ne2' [-Wunused-variable]
localai-api-1  |  9419 |     const int ne2 = src0->ne[2]; // n_head -> this is k
localai-api-1  |       |               ^~~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c: In function 'ggml_compute_forward_alibi':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230511/ggml.c:9468:5: warning: enumeration value 'GGML_TYPE_Q4_3' not handled in switch [-Wswitch]
localai-api-1  |  9468 |     switch (src0->type) {
localai-api-1  |       |     ^~~~~~
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 53%] Built target ggml-230511-avxonly
localai-api-1  | make  -f CMakeFiles/llama-230511-avxonly.dir/build.make CMakeFiles/llama-230511-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llama-230511-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llama-230511-avxonly.dir/build.make CMakeFiles/llama-230511-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 54%] Building CXX object CMakeFiles/llama-230511-avxonly.dir/llama.cpp-230511/llama.cpp.o
localai-api-1  | /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/llama-230511-avxonly.dir/llama.cpp-230511/llama.cpp.o -MF CMakeFiles/llama-230511-avxonly.dir/llama.cpp-230511/llama.cpp.o.d -o CMakeFiles/llama-230511-avxonly.dir/llama.cpp-230511/llama.cpp.o -c /build/gpt4all/gpt4all-backend/llama.cpp-230511/llama.cpp
localai-api-1  | [ 55%] Linking CXX static library libllama-230511-avxonly.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/llama-230511-avxonly.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llama-230511-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libllama-230511-avxonly.a "CMakeFiles/llama-230511-avxonly.dir/llama.cpp-230511/llama.cpp.o" "CMakeFiles/ggml-230511-avxonly.dir/llama.cpp-230511/ggml.c.o"
localai-api-1  | /usr/bin/ranlib libllama-230511-avxonly.a
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 55%] Built target llama-230511-avxonly
localai-api-1  | make  -f CMakeFiles/ggml-230519-avxonly.dir/build.make CMakeFiles/ggml-230519-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/ggml-230519-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/ggml-230519-avxonly.dir/build.make CMakeFiles/ggml-230519-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 56%] Building C object CMakeFiles/ggml-230519-avxonly.dir/llama.cpp-230519/ggml.c.o
localai-api-1  | /usr/bin/cc  -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230519 -O3 -DNDEBUG -std=gnu11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wdouble-promotion -Wshadow -Wstrict-prototypes -Wpointer-arith -pthread -MD -MT CMakeFiles/ggml-230519-avxonly.dir/llama.cpp-230519/ggml.c.o -MF CMakeFiles/ggml-230519-avxonly.dir/llama.cpp-230519/ggml.c.o.d -o CMakeFiles/ggml-230519-avxonly.dir/llama.cpp-230519/ggml.c.o -c /build/gpt4all/gpt4all-backend/llama.cpp-230519/ggml.c
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230519/ggml.c: In function 'quantize_row_q8_0':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230519/ggml.c:1025:15: warning: unused variable 'nb' [-Wunused-variable]
localai-api-1  |  1025 |     const int nb = k / QK8_0;
localai-api-1  |       |               ^~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230519/ggml.c: In function 'quantize_row_q8_1':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230519/ggml.c:1187:15: warning: unused variable 'nb' [-Wunused-variable]
localai-api-1  |  1187 |     const int nb = k / QK8_1;
localai-api-1  |       |               ^~
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 56%] Built target ggml-230519-avxonly
localai-api-1  | make  -f CMakeFiles/llama-230519-avxonly.dir/build.make CMakeFiles/llama-230519-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llama-230519-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llama-230519-avxonly.dir/build.make CMakeFiles/llama-230519-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 57%] Building CXX object CMakeFiles/llama-230519-avxonly.dir/llama.cpp-230519/llama.cpp.o
localai-api-1  | /usr/bin/c++ -DLLAMA_BUILD -DLLAMA_SHARED -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230519 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/llama-230519-avxonly.dir/llama.cpp-230519/llama.cpp.o -MF CMakeFiles/llama-230519-avxonly.dir/llama.cpp-230519/llama.cpp.o.d -o CMakeFiles/llama-230519-avxonly.dir/llama.cpp-230519/llama.cpp.o -c /build/gpt4all/gpt4all-backend/llama.cpp-230519/llama.cpp
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230519/llama.cpp: In function 'size_t llama_set_state_data(llama_context*, const uint8_t*)':
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230519/llama.cpp:2685:27: warning: cast from type 'const uint8_t*' {aka 'const unsigned char*'} to type 'void*' casts away qualifiers [-Wcast-qual]
localai-api-1  |  2685 |             kin3d->data = (void *) inp;
localai-api-1  |       |                           ^~~~~~~~~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-230519/llama.cpp:2689:27: warning: cast from type 'const uint8_t*' {aka 'const unsigned char*'} to type 'void*' casts away qualifiers [-Wcast-qual]
localai-api-1  |  2689 |             vin3d->data = (void *) inp;
localai-api-1  |       |                           ^~~~~~~~~~~~
localai-api-1  | [ 58%] Linking CXX static library libllama-230519-avxonly.a
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -P CMakeFiles/llama-230519-avxonly.dir/cmake_clean_target.cmake
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llama-230519-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/ar qc libllama-230519-avxonly.a "CMakeFiles/llama-230519-avxonly.dir/llama.cpp-230519/llama.cpp.o" "CMakeFiles/ggml-230519-avxonly.dir/llama.cpp-230519/ggml.c.o"
localai-api-1  | /usr/bin/ranlib libllama-230519-avxonly.a
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 58%] Built target llama-230519-avxonly
localai-api-1  | make  -f CMakeFiles/llamamodel-mainline-avxonly.dir/build.make CMakeFiles/llamamodel-mainline-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llamamodel-mainline-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llamamodel-mainline-avxonly.dir/build.make CMakeFiles/llamamodel-mainline-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 60%] Building CXX object CMakeFiles/llamamodel-mainline-avxonly.dir/llamamodel.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -DLLAMA_DATE=999999 -DLLAMA_VERSIONS=">=3" -Dllamamodel_mainline_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-mainline-avxonly.dir/llamamodel.cpp.o -MF CMakeFiles/llamamodel-mainline-avxonly.dir/llamamodel.cpp.o.d -o CMakeFiles/llamamodel-mainline-avxonly.dir/llamamodel.cpp.o -c /build/gpt4all/gpt4all-backend/llamamodel.cpp
localai-api-1  | /build/gpt4all/gpt4all-backend/llamamodel.cpp: In member function 'virtual bool LLamaModel::loadModel(const string&)':
localai-api-1  | /build/gpt4all/gpt4all-backend/llamamodel.cpp:159:71: warning: 'llama_context* llama_init_from_file(const char*, llama_context_params)' is deprecated: please use llama_load_model_from_file combined with llama_new_context_with_model instead [-Wdeprecated-declarations]
localai-api-1  |   159 |     d_ptr->ctx = llama_init_from_file(modelPath.c_str(), d_ptr->params);
localai-api-1  |       |                                                                       ^
localai-api-1  | In file included from /build/gpt4all/gpt4all-backend/llamamodel.cpp:28:
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.h:161:49: note: declared here
localai-api-1  |   161 |     LLAMA_API DEPRECATED(struct llama_context * llama_init_from_file(
localai-api-1  |       |                                                 ^~~~~~~~~~~~~~~~~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.h:30:36: note: in definition of macro 'DEPRECATED'
localai-api-1  |    30 | #    define DEPRECATED(func, hint) func __attribute__((deprecated(hint)))
localai-api-1  |       |                                    ^~~~
localai-api-1  | [ 61%] Building CXX object CMakeFiles/llamamodel-mainline-avxonly.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -DLLAMA_DATE=999999 -DLLAMA_VERSIONS=">=3" -Dllamamodel_mainline_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-mainline-avxonly.dir/llmodel_shared.cpp.o -MF CMakeFiles/llamamodel-mainline-avxonly.dir/llmodel_shared.cpp.o.d -o CMakeFiles/llamamodel-mainline-avxonly.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 62%] Linking CXX shared library libllamamodel-mainline-avxonly.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llamamodel-mainline-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libllamamodel-mainline-avxonly.so -o libllamamodel-mainline-avxonly.so "CMakeFiles/llamamodel-mainline-avxonly.dir/llamamodel.cpp.o" "CMakeFiles/llamamodel-mainline-avxonly.dir/llmodel_shared.cpp.o"  libllama-mainline-avxonly.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 62%] Built target llamamodel-mainline-avxonly
localai-api-1  | make  -f CMakeFiles/replit-mainline-avxonly.dir/build.make CMakeFiles/replit-mainline-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/replit-mainline-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/replit-mainline-avxonly.dir/build.make CMakeFiles/replit-mainline-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 63%] Building CXX object CMakeFiles/replit-mainline-avxonly.dir/replit.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dreplit_mainline_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/replit-mainline-avxonly.dir/replit.cpp.o -MF CMakeFiles/replit-mainline-avxonly.dir/replit.cpp.o.d -o CMakeFiles/replit-mainline-avxonly.dir/replit.cpp.o -c /build/gpt4all/gpt4all-backend/replit.cpp
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp: In function 'bool replit_eval(const replit_model&, int, int, const std::vector<int>&, std::vector<float>&, size_t&)':
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::n_nodes' [-Wmissing-field-initializers]
localai-api-1  |   555 |     struct ggml_cgraph gf = {.n_threads = n_threads};
localai-api-1  |       |                                                    ^
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::n_leafs' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::work_size' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::work' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::nodes' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::grads' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::leafs' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::perf_runs' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::perf_cycles' [-Wmissing-field-initializers]
localai-api-1  | /build/gpt4all/gpt4all-backend/replit.cpp:555:52: warning: missing initializer for member 'ggml_cgraph::perf_time_us' [-Wmissing-field-initializers]
localai-api-1  | [ 64%] Building CXX object CMakeFiles/replit-mainline-avxonly.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dreplit_mainline_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/replit-mainline-avxonly.dir/utils.cpp.o -MF CMakeFiles/replit-mainline-avxonly.dir/utils.cpp.o.d -o CMakeFiles/replit-mainline-avxonly.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 65%] Building CXX object CMakeFiles/replit-mainline-avxonly.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dreplit_mainline_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/replit-mainline-avxonly.dir/llmodel_shared.cpp.o -MF CMakeFiles/replit-mainline-avxonly.dir/llmodel_shared.cpp.o.d -o CMakeFiles/replit-mainline-avxonly.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 66%] Linking CXX shared library libreplit-mainline-avxonly.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/replit-mainline-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libreplit-mainline-avxonly.so -o libreplit-mainline-avxonly.so "CMakeFiles/replit-mainline-avxonly.dir/replit.cpp.o" "CMakeFiles/replit-mainline-avxonly.dir/utils.cpp.o" "CMakeFiles/replit-mainline-avxonly.dir/llmodel_shared.cpp.o"  libllama-mainline-avxonly.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 66%] Built target replit-mainline-avxonly
localai-api-1  | make  -f CMakeFiles/llamamodel-230519-avxonly.dir/build.make CMakeFiles/llamamodel-230519-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llamamodel-230519-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llamamodel-230519-avxonly.dir/build.make CMakeFiles/llamamodel-230519-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 67%] Building CXX object CMakeFiles/llamamodel-230519-avxonly.dir/llamamodel.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -DLLAMA_DATE=230519 -DLLAMA_VERSIONS===2 -Dllamamodel_230519_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230519 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-230519-avxonly.dir/llamamodel.cpp.o -MF CMakeFiles/llamamodel-230519-avxonly.dir/llamamodel.cpp.o.d -o CMakeFiles/llamamodel-230519-avxonly.dir/llamamodel.cpp.o -c /build/gpt4all/gpt4all-backend/llamamodel.cpp
localai-api-1  | [ 68%] Building CXX object CMakeFiles/llamamodel-230519-avxonly.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -DLLAMA_DATE=230519 -DLLAMA_VERSIONS===2 -Dllamamodel_230519_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230519 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-230519-avxonly.dir/llmodel_shared.cpp.o -MF CMakeFiles/llamamodel-230519-avxonly.dir/llmodel_shared.cpp.o.d -o CMakeFiles/llamamodel-230519-avxonly.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 70%] Linking CXX shared library libllamamodel-230519-avxonly.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llamamodel-230519-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libllamamodel-230519-avxonly.so -o libllamamodel-230519-avxonly.so "CMakeFiles/llamamodel-230519-avxonly.dir/llamamodel.cpp.o" "CMakeFiles/llamamodel-230519-avxonly.dir/llmodel_shared.cpp.o"  libllama-230519-avxonly.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 70%] Built target llamamodel-230519-avxonly
localai-api-1  | make  -f CMakeFiles/llamamodel-230511-avxonly.dir/build.make CMakeFiles/llamamodel-230511-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llamamodel-230511-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llamamodel-230511-avxonly.dir/build.make CMakeFiles/llamamodel-230511-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 71%] Building CXX object CMakeFiles/llamamodel-230511-avxonly.dir/llamamodel.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -DLLAMA_DATE=230511 -DLLAMA_VERSIONS="<=1" -Dllamamodel_230511_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-230511-avxonly.dir/llamamodel.cpp.o -MF CMakeFiles/llamamodel-230511-avxonly.dir/llamamodel.cpp.o.d -o CMakeFiles/llamamodel-230511-avxonly.dir/llamamodel.cpp.o -c /build/gpt4all/gpt4all-backend/llamamodel.cpp
localai-api-1  | [ 72%] Building CXX object CMakeFiles/llamamodel-230511-avxonly.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -DLLAMA_DATE=230511 -DLLAMA_VERSIONS="<=1" -Dllamamodel_230511_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llamamodel-230511-avxonly.dir/llmodel_shared.cpp.o -MF CMakeFiles/llamamodel-230511-avxonly.dir/llmodel_shared.cpp.o.d -o CMakeFiles/llamamodel-230511-avxonly.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 73%] Linking CXX shared library libllamamodel-230511-avxonly.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llamamodel-230511-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libllamamodel-230511-avxonly.so -o libllamamodel-230511-avxonly.so "CMakeFiles/llamamodel-230511-avxonly.dir/llamamodel.cpp.o" "CMakeFiles/llamamodel-230511-avxonly.dir/llmodel_shared.cpp.o"  libllama-230511-avxonly.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 73%] Built target llamamodel-230511-avxonly
localai-api-1  | make  -f CMakeFiles/gptj-avxonly.dir/build.make CMakeFiles/gptj-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/gptj-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/gptj-avxonly.dir/build.make CMakeFiles/gptj-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 74%] Building CXX object CMakeFiles/gptj-avxonly.dir/gptj.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dgptj_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/gptj-avxonly.dir/gptj.cpp.o -MF CMakeFiles/gptj-avxonly.dir/gptj.cpp.o.d -o CMakeFiles/gptj-avxonly.dir/gptj.cpp.o -c /build/gpt4all/gpt4all-backend/gptj.cpp
localai-api-1  | [ 75%] Building CXX object CMakeFiles/gptj-avxonly.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dgptj_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/gptj-avxonly.dir/utils.cpp.o -MF CMakeFiles/gptj-avxonly.dir/utils.cpp.o.d -o CMakeFiles/gptj-avxonly.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 76%] Building CXX object CMakeFiles/gptj-avxonly.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dgptj_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/gptj-avxonly.dir/llmodel_shared.cpp.o -MF CMakeFiles/gptj-avxonly.dir/llmodel_shared.cpp.o.d -o CMakeFiles/gptj-avxonly.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 77%] Linking CXX shared library libgptj-avxonly.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/gptj-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libgptj-avxonly.so -o libgptj-avxonly.so "CMakeFiles/gptj-avxonly.dir/gptj.cpp.o" "CMakeFiles/gptj-avxonly.dir/utils.cpp.o" "CMakeFiles/gptj-avxonly.dir/llmodel_shared.cpp.o" "CMakeFiles/ggml-230511-avxonly.dir/llama.cpp-230511/ggml.c.o"  -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 77%] Built target gptj-avxonly
localai-api-1  | make  -f CMakeFiles/falcon-avxonly.dir/build.make CMakeFiles/falcon-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/falcon-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/falcon-avxonly.dir/build.make CMakeFiles/falcon-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 78%] Building CXX object CMakeFiles/falcon-avxonly.dir/falcon.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dfalcon_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/falcon-avxonly.dir/falcon.cpp.o -MF CMakeFiles/falcon-avxonly.dir/falcon.cpp.o.d -o CMakeFiles/falcon-avxonly.dir/falcon.cpp.o -c /build/gpt4all/gpt4all-backend/falcon.cpp
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp: In function 'bool falcon_model_load(const string&, falcon_model&, gpt_vocab&, size_t*)':
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp:199:19: warning: unused variable 'n_ctx' [-Wunused-variable]
localai-api-1  |   199 |         const int n_ctx = hparams.n_ctx;
localai-api-1  |       |                   ^~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp:340:19: warning: unused variable 'n_head_kv' [-Wunused-variable]
localai-api-1  |   340 |         const int n_head_kv = hparams.n_head_kv;
localai-api-1  |       |                   ^~~~~~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp:344:23: warning: unused variable 'n_elements' [-Wunused-variable]
localai-api-1  |   344 |         const int64_t n_elements = head_dim*n_mem;
localai-api-1  |       |                       ^~~~~~~~~~
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp: In function 'bool falcon_eval(const falcon_model&, int, int, const std::vector<int>&, std::vector<float>&, size_t&)':
localai-api-1  | /build/gpt4all/gpt4all-backend/falcon.cpp:465:15: warning: unused variable 'version' [-Wunused-variable]
localai-api-1  |   465 |     const int version = hparams.falcon_version;
localai-api-1  |       |               ^~~~~~~
localai-api-1  | [ 80%] Building CXX object CMakeFiles/falcon-avxonly.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dfalcon_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/falcon-avxonly.dir/utils.cpp.o -MF CMakeFiles/falcon-avxonly.dir/utils.cpp.o.d -o CMakeFiles/falcon-avxonly.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 81%] Building CXX object CMakeFiles/falcon-avxonly.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dfalcon_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/falcon-avxonly.dir/llmodel_shared.cpp.o -MF CMakeFiles/falcon-avxonly.dir/llmodel_shared.cpp.o.d -o CMakeFiles/falcon-avxonly.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 82%] Linking CXX shared library libfalcon-avxonly.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/falcon-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libfalcon-avxonly.so -o libfalcon-avxonly.so "CMakeFiles/falcon-avxonly.dir/falcon.cpp.o" "CMakeFiles/falcon-avxonly.dir/utils.cpp.o" "CMakeFiles/falcon-avxonly.dir/llmodel_shared.cpp.o"  libllama-mainline-avxonly.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 82%] Built target falcon-avxonly
localai-api-1  | make  -f CMakeFiles/mpt-avxonly.dir/build.make CMakeFiles/mpt-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/mpt-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/mpt-avxonly.dir/build.make CMakeFiles/mpt-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 83%] Building CXX object CMakeFiles/mpt-avxonly.dir/mpt.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dmpt_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/mpt-avxonly.dir/mpt.cpp.o -MF CMakeFiles/mpt-avxonly.dir/mpt.cpp.o.d -o CMakeFiles/mpt-avxonly.dir/mpt.cpp.o -c /build/gpt4all/gpt4all-backend/mpt.cpp
localai-api-1  | [ 84%] Building CXX object CMakeFiles/mpt-avxonly.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dmpt_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/mpt-avxonly.dir/utils.cpp.o -MF CMakeFiles/mpt-avxonly.dir/utils.cpp.o.d -o CMakeFiles/mpt-avxonly.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 85%] Building CXX object CMakeFiles/mpt-avxonly.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dmpt_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-230511 -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -pthread -MD -MT CMakeFiles/mpt-avxonly.dir/llmodel_shared.cpp.o -MF CMakeFiles/mpt-avxonly.dir/llmodel_shared.cpp.o.d -o CMakeFiles/mpt-avxonly.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 86%] Linking CXX shared library libmpt-avxonly.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/mpt-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libmpt-avxonly.so -o libmpt-avxonly.so "CMakeFiles/mpt-avxonly.dir/mpt.cpp.o" "CMakeFiles/mpt-avxonly.dir/utils.cpp.o" "CMakeFiles/mpt-avxonly.dir/llmodel_shared.cpp.o" "CMakeFiles/ggml-230511-avxonly.dir/llama.cpp-230511/ggml.c.o"  -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 86%] Built target mpt-avxonly
localai-api-1  | make  -f CMakeFiles/bert-avxonly.dir/build.make CMakeFiles/bert-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/bert-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/bert-avxonly.dir/build.make CMakeFiles/bert-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 87%] Building CXX object CMakeFiles/bert-avxonly.dir/bert.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dbert_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/bert-avxonly.dir/bert.cpp.o -MF CMakeFiles/bert-avxonly.dir/bert.cpp.o.d -o CMakeFiles/bert-avxonly.dir/bert.cpp.o -c /build/gpt4all/gpt4all-backend/bert.cpp
localai-api-1  | [ 88%] Building CXX object CMakeFiles/bert-avxonly.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dbert_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/bert-avxonly.dir/utils.cpp.o -MF CMakeFiles/bert-avxonly.dir/utils.cpp.o.d -o CMakeFiles/bert-avxonly.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 90%] Building CXX object CMakeFiles/bert-avxonly.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dbert_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/bert-avxonly.dir/llmodel_shared.cpp.o -MF CMakeFiles/bert-avxonly.dir/llmodel_shared.cpp.o.d -o CMakeFiles/bert-avxonly.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 91%] Linking CXX shared library libbert-avxonly.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/bert-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libbert-avxonly.so -o libbert-avxonly.so "CMakeFiles/bert-avxonly.dir/bert.cpp.o" "CMakeFiles/bert-avxonly.dir/utils.cpp.o" "CMakeFiles/bert-avxonly.dir/llmodel_shared.cpp.o"  libllama-mainline-avxonly.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 91%] Built target bert-avxonly
localai-api-1  | make  -f CMakeFiles/starcoder-avxonly.dir/build.make CMakeFiles/starcoder-avxonly.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/starcoder-avxonly.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/starcoder-avxonly.dir/build.make CMakeFiles/starcoder-avxonly.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 92%] Building CXX object CMakeFiles/starcoder-avxonly.dir/starcoder.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dstarcoder_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/starcoder-avxonly.dir/starcoder.cpp.o -MF CMakeFiles/starcoder-avxonly.dir/starcoder.cpp.o.d -o CMakeFiles/starcoder-avxonly.dir/starcoder.cpp.o -c /build/gpt4all/gpt4all-backend/starcoder.cpp
localai-api-1  | /build/gpt4all/gpt4all-backend/starcoder.cpp: In function 'bool starcoder_eval(const starcoder_model&, int, int, const std::vector<int>&, std::vector<float>&, size_t&)':
localai-api-1  | /build/gpt4all/gpt4all-backend/starcoder.cpp:470:18: warning: unused variable 'head_dim' [-Wunused-variable]
localai-api-1  |   470 |     const size_t head_dim = n_embd / n_head;
localai-api-1  |       |                  ^~~~~~~~
localai-api-1  | [ 93%] Building CXX object CMakeFiles/starcoder-avxonly.dir/utils.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dstarcoder_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/starcoder-avxonly.dir/utils.cpp.o -MF CMakeFiles/starcoder-avxonly.dir/utils.cpp.o.d -o CMakeFiles/starcoder-avxonly.dir/utils.cpp.o -c /build/gpt4all/gpt4all-backend/utils.cpp
localai-api-1  | [ 94%] Building CXX object CMakeFiles/starcoder-avxonly.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DGGML_BUILD_VARIANT=\"avxonly\" -Dstarcoder_avxonly_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -I/build/gpt4all/gpt4all-backend/llama.cpp-mainline -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/starcoder-avxonly.dir/llmodel_shared.cpp.o -MF CMakeFiles/starcoder-avxonly.dir/llmodel_shared.cpp.o.d -o CMakeFiles/starcoder-avxonly.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 95%] Linking CXX shared library libstarcoder-avxonly.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/starcoder-avxonly.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libstarcoder-avxonly.so -o libstarcoder-avxonly.so "CMakeFiles/starcoder-avxonly.dir/starcoder.cpp.o" "CMakeFiles/starcoder-avxonly.dir/utils.cpp.o" "CMakeFiles/starcoder-avxonly.dir/llmodel_shared.cpp.o"  libllama-mainline-avxonly.a -pthread 
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 95%] Built target starcoder-avxonly
localai-api-1  | make  -f CMakeFiles/llmodel.dir/build.make CMakeFiles/llmodel.dir/depend
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd /build/gpt4all/gpt4all-bindings/golang/buildllm && /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_depends "Unix Makefiles" /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-backend /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles/llmodel.dir/DependInfo.cmake "--color="
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | make  -f CMakeFiles/llmodel.dir/build.make CMakeFiles/llmodel.dir/build
localai-api-1  | make[4]: Entering directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [ 96%] Building CXX object CMakeFiles/llmodel.dir/llmodel.cpp.o
localai-api-1  | /usr/bin/c++ -DLIB_FILE_EXT=\".so\" -Dllmodel_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llmodel.dir/llmodel.cpp.o -MF CMakeFiles/llmodel.dir/llmodel.cpp.o.d -o CMakeFiles/llmodel.dir/llmodel.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel.cpp
localai-api-1  | [ 97%] Building CXX object CMakeFiles/llmodel.dir/llmodel_shared.cpp.o
localai-api-1  | /usr/bin/c++ -DLIB_FILE_EXT=\".so\" -Dllmodel_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llmodel.dir/llmodel_shared.cpp.o -MF CMakeFiles/llmodel.dir/llmodel_shared.cpp.o.d -o CMakeFiles/llmodel.dir/llmodel_shared.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_shared.cpp
localai-api-1  | [ 98%] Building CXX object CMakeFiles/llmodel.dir/llmodel_c.cpp.o
localai-api-1  | /usr/bin/c++ -DLIB_FILE_EXT=\".so\" -Dllmodel_EXPORTS -I/build/gpt4all/gpt4all-bindings/golang/buildllm -O3 -DNDEBUG -std=gnu++2a -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wno-multichar -MD -MT CMakeFiles/llmodel.dir/llmodel_c.cpp.o -MF CMakeFiles/llmodel.dir/llmodel_c.cpp.o.d -o CMakeFiles/llmodel.dir/llmodel_c.cpp.o -c /build/gpt4all/gpt4all-backend/llmodel_c.cpp
localai-api-1  | [100%] Linking CXX shared library libllmodel.so
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_link_script CMakeFiles/llmodel.dir/link.txt --verbose=1
localai-api-1  | /usr/bin/c++ -fPIC -O3 -DNDEBUG -shared -Wl,-soname,libllmodel.so.0 -o libllmodel.so.0.3.0 CMakeFiles/llmodel.dir/llmodel.cpp.o CMakeFiles/llmodel.dir/llmodel_shared.cpp.o CMakeFiles/llmodel.dir/llmodel_c.cpp.o 
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_symlink_library libllmodel.so.0.3.0 libllmodel.so.0 libllmodel.so
localai-api-1  | make[4]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | [100%] Built target llmodel
localai-api-1  | make[3]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | /usr/local/lib/python3.9/dist-packages/cmake/data/bin/cmake -E cmake_progress_start /build/gpt4all/gpt4all-bindings/golang/buildllm/CMakeFiles 0
localai-api-1  | make[2]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang/buildllm'
localai-api-1  | cd buildllm && cp -rf CMakeFiles/llmodel.dir/llmodel_c.cpp.o ../llmodel_c.o
localai-api-1  | cd buildllm && cp -rf CMakeFiles/llmodel.dir/llmodel.cpp.o ../llmodel.o
localai-api-1  | ar src libgpt4all.a llmodel.o binding.o
localai-api-1  | make[1]: Leaving directory '/build/gpt4all/gpt4all-bindings/golang'
localai-api-1  | mkdir -p backend-assets/gpt4all
localai-api-1  | cp: cannot stat 'gpt4all/gpt4all-bindings/golang/buildllm/*.dylib': No such file or directory
localai-api-1  | cp: cannot stat 'gpt4all/gpt4all-bindings/golang/buildllm/*.dll': No such file or directory
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/gpt4all/gpt4all-bindings/golang/ LIBRARY_PATH=/build/gpt4all/gpt4all-bindings/golang/ \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/gpt4all ./cmd/grpc/gpt4all/
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-ggml-transformers LIBRARY_PATH=/build/go-ggml-transformers \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/dolly ./cmd/grpc/dolly/
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-ggml-transformers LIBRARY_PATH=/build/go-ggml-transformers \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/gpt2 ./cmd/grpc/gpt2/
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-ggml-transformers LIBRARY_PATH=/build/go-ggml-transformers \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/gptj ./cmd/grpc/gptj/
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-ggml-transformers LIBRARY_PATH=/build/go-ggml-transformers \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/gptneox ./cmd/grpc/gptneox/
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-ggml-transformers LIBRARY_PATH=/build/go-ggml-transformers \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/mpt ./cmd/grpc/mpt/
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-ggml-transformers LIBRARY_PATH=/build/go-ggml-transformers \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/replit ./cmd/grpc/replit/
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-ggml-transformers LIBRARY_PATH=/build/go-ggml-transformers \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/starcoder ./cmd/grpc/starcoder/
localai-api-1  | cd go-rwkv && cd rwkv.cpp &&	cmake . -DRWKV_BUILD_SHARED_LIBRARY=OFF &&	cmake --build . && 	cp librwkv.a ..
localai-api-1  | -- The C compiler identification is GNU 10.2.1
localai-api-1  | -- The CXX compiler identification is GNU 10.2.1
localai-api-1  | -- Detecting C compiler ABI info
localai-api-1  | -- Detecting C compiler ABI info - done
localai-api-1  | -- Check for working C compiler: /usr/bin/cc - skipped
localai-api-1  | -- Detecting C compile features
localai-api-1  | -- Detecting C compile features - done
localai-api-1  | -- Detecting CXX compiler ABI info
localai-api-1  | -- Detecting CXX compiler ABI info - done
localai-api-1  | -- Check for working CXX compiler: /usr/bin/c++ - skipped
localai-api-1  | -- Detecting CXX compile features
localai-api-1  | -- Detecting CXX compile features - done
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD
localai-api-1  | -- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Failed
localai-api-1  | -- Check if compiler accepts -pthread
localai-api-1  | -- Check if compiler accepts -pthread - yes
localai-api-1  | -- Found Threads: TRUE  
localai-api-1  | -- CMAKE_SYSTEM_PROCESSOR: x86_64
localai-api-1  | -- x86 detected
localai-api-1  | -- Configuring done (0.7s)
localai-api-1  | -- Generating done (0.0s)
localai-api-1  | -- Build files have been written to: /build/go-rwkv/rwkv.cpp
localai-api-1  | gmake[1]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[2]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [  7%] Building C object CMakeFiles/ggml.dir/ggml/src/ggml.c.o
localai-api-1  | /build/go-rwkv/rwkv.cpp/ggml/src/ggml.c: In function ‘ggml_compute_forward_win_part_f32’:
localai-api-1  | /build/go-rwkv/rwkv.cpp/ggml/src/ggml.c:13064:19: warning: unused variable ‘ne3’ [-Wunused-variable]
localai-api-1  | 13064 |     const int64_t ne3 = dst->ne[3];
localai-api-1  |       |                   ^~~
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [  7%] Built target ggml
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 15%] Building CXX object CMakeFiles/rwkv.dir/rwkv.cpp.o
localai-api-1  | /build/go-rwkv/rwkv.cpp/rwkv.cpp: In function ‘bool rwkv_fread_string(FILE*, size_t, std::string&)’:
localai-api-1  | /build/go-rwkv/rwkv.cpp/rwkv.cpp:149:18: warning: cast from type ‘const char*’ to type ‘void*’ casts away qualifiers [-Wcast-qual]
localai-api-1  |   149 |     return fread((void *) dest.data(), length, 1, file) == 1;
localai-api-1  |       |                  ^~~~~~~~~~~~~~~~~~~~
localai-api-1  | /build/go-rwkv/rwkv.cpp/rwkv.cpp: At global scope:
localai-api-1  | /build/go-rwkv/rwkv.cpp/rwkv.cpp:223:21: warning: ‘rwkv_type_to_string’ initialized and declared ‘extern’
localai-api-1  |   223 | extern const char * rwkv_type_to_string[TYPE_COUNT + 1] = {"float32", "float16", "Q4_0", "Q4_1", "Q4_1_O", "Q4_2", "Q4_3", "Q5_0", "Q5_1", "Q8_0", "unknown"};
localai-api-1  |       |                     ^~~~~~~~~~~~~~~~~~~
localai-api-1  | /build/go-rwkv/rwkv.cpp/rwkv.cpp: In function ‘bool rwkv_gpu_offload_layers(const rwkv_context*, uint32_t)’:
localai-api-1  | /build/go-rwkv/rwkv.cpp/rwkv.cpp:1280:58: warning: unused parameter ‘ctx’ [-Wunused-parameter]
localai-api-1  |  1280 | bool rwkv_gpu_offload_layers(const struct rwkv_context * ctx, const uint32_t n_gpu_layers) {
localai-api-1  |       |                              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~
localai-api-1  | /build/go-rwkv/rwkv.cpp/rwkv.cpp:1280:78: warning: unused parameter ‘n_gpu_layers’ [-Wunused-parameter]
localai-api-1  |  1280 | bool rwkv_gpu_offload_layers(const struct rwkv_context * ctx, const uint32_t n_gpu_layers) {
localai-api-1  |       |                                                               ~~~~~~~~~~~~~~~^~~~~~~~~~~~
localai-api-1  | [ 23%] Linking CXX static library librwkv.a
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 23%] Built target rwkv
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 30%] Building C object tests/CMakeFiles/test_ggml_basics.dir/test_ggml_basics.c.o
localai-api-1  | [ 38%] Linking CXX executable ../bin/test_ggml_basics
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 38%] Built target test_ggml_basics
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 46%] Building C object tests/CMakeFiles/test_tiny_rwkv.dir/test_tiny_rwkv.c.o
localai-api-1  | /build/go-rwkv/rwkv.cpp/tests/test_tiny_rwkv.c: In function ‘test_model’:
localai-api-1  | /build/go-rwkv/rwkv.cpp/tests/test_tiny_rwkv.c:60:45: warning: implicit conversion from ‘float’ to ‘double’ when passing argument to function [-Wdouble-promotion]
localai-api-1  |    60 |     fprintf(stderr, "Difference sum: %f\n", diff_sum);
localai-api-1  |       |                                             ^~~~~~~~
localai-api-1  | /build/go-rwkv/rwkv.cpp/tests/test_tiny_rwkv.c:74:54: warning: implicit conversion from ‘float’ to ‘double’ when passing argument to function [-Wdouble-promotion]
localai-api-1  |    74 |     fprintf(stderr, "Sequence difference sum: %f\n", diff_sum);
localai-api-1  |       |                                                      ^~~~~~~~
localai-api-1  | [ 53%] Linking CXX executable ../bin/test_tiny_rwkv
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 53%] Built target test_tiny_rwkv
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 61%] Building C object tests/CMakeFiles/test_context_cloning.dir/test_context_cloning.c.o
localai-api-1  | /build/go-rwkv/rwkv.cpp/tests/test_context_cloning.c:7:5: warning: function declaration isn’t a prototype [-Wstrict-prototypes]
localai-api-1  |     7 | int main() {
localai-api-1  |       |     ^~~~
localai-api-1  | [ 69%] Linking CXX executable ../bin/test_context_cloning
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 69%] Built target test_context_cloning
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 76%] Building C object extras/CMakeFiles/rwkv_cpu_info.dir/cpu_info.c.o
localai-api-1  | /build/go-rwkv/rwkv.cpp/extras/cpu_info.c:5:5: warning: function declaration isn’t a prototype [-Wstrict-prototypes]
localai-api-1  |     5 | int main() {
localai-api-1  |       |     ^~~~
localai-api-1  | [ 84%] Linking CXX executable ../bin/rwkv_cpu_info
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 84%] Built target rwkv_cpu_info
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[3]: Entering directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [ 92%] Building C object extras/CMakeFiles/rwkv_quantize.dir/quantize.c.o
localai-api-1  | [100%] Linking CXX executable ../bin/rwkv_quantize
localai-api-1  | gmake[3]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | [100%] Built target rwkv_quantize
localai-api-1  | gmake[2]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | gmake[1]: Leaving directory '/build/go-rwkv/rwkv.cpp'
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-rwkv LIBRARY_PATH=/build/go-rwkv \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/rwkv ./cmd/grpc/rwkv/
localai-api-1  | cd whisper.cpp && make libwhisper.a
localai-api-1  | make[1]: Entering directory '/build/whisper.cpp'
localai-api-1  | I whisper.cpp build info: 
localai-api-1  | I UNAME_S:  Linux
localai-api-1  | I UNAME_P:  unknown
localai-api-1  | I UNAME_M:  x86_64
localai-api-1  | I CFLAGS:   -I.              -O3 -DNDEBUG -std=c11   -fPIC -D_XOPEN_SOURCE=600 -pthread -msse3
localai-api-1  | I CXXFLAGS: -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -D_XOPEN_SOURCE=600 -pthread
localai-api-1  | I LDFLAGS:  
localai-api-1  | I CC:       cc (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | I CXX:      g++ (Debian 10.2.1-6) 10.2.1 20210110
localai-api-1  | 
localai-api-1  | cc  -I.              -O3 -DNDEBUG -std=c11   -fPIC -D_XOPEN_SOURCE=600 -pthread -msse3   -c ggml.c -o ggml.o
localai-api-1  | g++ -I. -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -D_XOPEN_SOURCE=600 -pthread -c whisper.cpp -o whisper.o
localai-api-1  | ar rcs libwhisper.a ggml.o whisper.o
localai-api-1  | make[1]: Leaving directory '/build/whisper.cpp'
localai-api-1  | CGO_LDFLAGS="" C_INCLUDE_PATH=/build/whisper.cpp LIBRARY_PATH=/build/whisper.cpp \
localai-api-1  | go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o backend-assets/grpc/whisper ./cmd/grpc/whisper/
localai-api-1  | I local-ai build info:
localai-api-1  | I BUILD_TYPE: 
localai-api-1  | I GO_TAGS: 
localai-api-1  | I LD_FLAGS: -X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"
localai-api-1  | CGO_LDFLAGS="" go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.2" -X "github.com/go-skynet/LocalAI/internal.Commit=acd829a7a0e1623c0871c8b34c36c76afd4feac8"" -tags "" -o local-ai ./
localai-api-1  | 3:29PM DBG no galleries to load
localai-api-1  | 3:29PM INF Starting LocalAI using 4 threads, with models path: /models
localai-api-1  | 3:29PM INF LocalAI version: v1.23.2 (acd829a7a0e1623c0871c8b34c36c76afd4feac8)
localai-api-1  | 3:29PM DBG Model: gpt4all-j (config: {PredictionOptions:{Model:ggml-gpt4all-j.bin Language: N:0 TopP:0.7 TopK:80 Temperature:0.2 Maxtokens:0 Echo:false Batch:0 F16:false IgnoreEOS:false RepeatPenalty:0 Keep:0 MirostatETA:0 MirostatTAU:0 Mirostat:0 FrequencyPenalty:0 TFZ:0 TypicalP:0 Seed:0 NegativePrompt: RopeFreqBase:0 RopeFreqScale:0 NegativePromptScale:0} Name:gpt4all-j StopWords:[] Cutstrings:[] TrimSpace:[] ContextSize:1024 F16:false NUMA:false Threads:0 Debug:false Roles:map[] Embeddings:false Backend:gpt4all-j TemplateConfig:{Chat:gpt4all-chat ChatMessage: Completion:gpt4all-completion Edit: Functions:} MirostatETA:0 MirostatTAU:0 Mirostat:0 NGPULayers:0 MMap:false MMlock:false LowVRAM:false TensorSplit: MainGPU: ImageGenerationAssets: PromptCachePath: PromptCacheAll:false PromptCacheRO:false Grammar: PromptStrings:[] InputStrings:[] InputToken:[] functionCallString: functionCallNameString: FunctionsConfig:{DisableNoAction:false NoActionFunctionName: NoActionDescriptionName:} SystemPrompt: RMSNormEps:0 NGQA:0})
localai-api-1  | 3:29PM DBG Extracting backend assets files to /tmp/localai/backend_data
localai-api-1  | 3:29PM DBG Checking "ggml-gpt4all-j.bin" exists and matches SHA
localai-api-1  | 3:29PM DBG File "ggml-gpt4all-j.bin" already exists and matches the SHA. Skipping download
localai-api-1  | 3:29PM DBG Prompt template "gpt4all-completion" written
localai-api-1  | 3:29PM DBG Prompt template "gpt4all-chat" written
localai-api-1  | 3:29PM DBG Written config file /models/gpt4all-j.yaml
localai-api-1  | 
localai-api-1  |  ┌───────────────────────────────────────────────────┐ 
localai-api-1  |  │                   Fiber v2.48.0                   │ 
localai-api-1  |  │               http://127.0.0.1:8080               │ 
localai-api-1  |  │       (bound on host 0.0.0.0 and port 8080)       │ 
localai-api-1  |  │                                                   │ 
localai-api-1  |  │ Handlers ............ 32  Processes ........... 1 │ 
localai-api-1  |  │ Prefork ....... Disabled  PID .............. 8757 │ 
localai-api-1  |  └───────────────────────────────────────────────────┘ 
localai-api-1  | 
localai-api-1  | 3:29PM DBG Request received: 
localai-api-1  | 3:29PM DBG Configuration read: &{PredictionOptions:{Model:ggml-gpt4all-j Language: N:0 TopP:0.7 TopK:80 Temperature:0.9 Maxtokens:512 Echo:false Batch:0 F16:false IgnoreEOS:false RepeatPenalty:0 Keep:0 MirostatETA:0 MirostatTAU:0 Mirostat:0 FrequencyPenalty:0 TFZ:0 TypicalP:0 Seed:0 NegativePrompt: RopeFreqBase:0 RopeFreqScale:0 NegativePromptScale:0} Name: StopWords:[] Cutstrings:[] TrimSpace:[] ContextSize:512 F16:false NUMA:false Threads:4 Debug:true Roles:map[] Embeddings:false Backend: TemplateConfig:{Chat: ChatMessage: Completion: Edit: Functions:} MirostatETA:0 MirostatTAU:0 Mirostat:0 NGPULayers:0 MMap:false MMlock:false LowVRAM:false TensorSplit: MainGPU: ImageGenerationAssets: PromptCachePath: PromptCacheAll:false PromptCacheRO:false Grammar: PromptStrings:[] InputStrings:[] InputToken:[] functionCallString: functionCallNameString: FunctionsConfig:{DisableNoAction:false NoActionFunctionName: NoActionDescriptionName:} SystemPrompt: RMSNormEps:0 NGQA:0}
localai-api-1  | 3:29PM DBG Parameters: &{PredictionOptions:{Model:ggml-gpt4all-j Language: N:0 TopP:0.7 TopK:80 Temperature:0.9 Maxtokens:512 Echo:false Batch:0 F16:false IgnoreEOS:false RepeatPenalty:0 Keep:0 MirostatETA:0 MirostatTAU:0 Mirostat:0 FrequencyPenalty:0 TFZ:0 TypicalP:0 Seed:0 NegativePrompt: RopeFreqBase:0 RopeFreqScale:0 NegativePromptScale:0} Name: StopWords:[] Cutstrings:[] TrimSpace:[] ContextSize:512 F16:false NUMA:false Threads:4 Debug:true Roles:map[] Embeddings:false Backend: TemplateConfig:{Chat: ChatMessage: Completion: Edit: Functions:} MirostatETA:0 MirostatTAU:0 Mirostat:0 NGPULayers:0 MMap:false MMlock:false LowVRAM:false TensorSplit: MainGPU: ImageGenerationAssets: PromptCachePath: PromptCacheAll:false PromptCacheRO:false Grammar: PromptStrings:[] InputStrings:[] InputToken:[] functionCallString: functionCallNameString: FunctionsConfig:{DisableNoAction:false NoActionFunctionName: NoActionDescriptionName:} SystemPrompt: RMSNormEps:0 NGQA:0}
localai-api-1  | 3:29PM DBG Prompt (before templating): How are you?
localai-api-1  | 3:29PM DBG Template found, input modified to: The prompt below is a question to answer, a task to complete, or a conversation to respond to; decide which and write an appropriate response.
localai-api-1  | ### Prompt:
localai-api-1  | How are you?
localai-api-1  | ### Response:
localai-api-1  | 
localai-api-1  | 3:29PM DBG Prompt (after templating): The prompt below is a question to answer, a task to complete, or a conversation to respond to; decide which and write an appropriate response.
localai-api-1  | ### Prompt:
localai-api-1  | How are you?
localai-api-1  | ### Response:
localai-api-1  | 
localai-api-1  | 3:29PM DBG Loading model 'ggml-gpt4all-j' greedly from all the available backends: llama, gpt4all, falcon, gptneox, bert-embeddings, falcon-ggml, gptj, gpt2, dolly, mpt, replit, starcoder, bloomz, rwkv, whisper, stablediffusion, piper, /build/extra/grpc/huggingface/huggingface.py
localai-api-1  | 3:29PM DBG [llama] Attempting to load
localai-api-1  | 3:29PM DBG Loading model llama from ggml-gpt4all-j
localai-api-1  | 3:29PM DBG Loading model in memory from file: /models/ggml-gpt4all-j
localai-api-1  | 3:29PM DBG Loading GRPC Model llama: {backendString:llama modelFile:ggml-gpt4all-j threads:4 assetDir:/tmp/localai/backend_data context:0xc00003e098 gRPCOptions:0xc00044a1e0 externalBackends:map[huggingface-embeddings:/build/extra/grpc/huggingface/huggingface.py]}
localai-api-1  | 3:29PM DBG Loading GRPC Process: /tmp/localai/backend_data/backend-assets/grpc/llama
localai-api-1  | 3:29PM DBG GRPC Service for ggml-gpt4all-j will be running at: '127.0.0.1:38783'
localai-api-1  | 3:29PM DBG GRPC Service state dir: /tmp/go-processmanager2536514090
localai-api-1  | 3:29PM DBG GRPC Service Started
localai-api-1  | rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:38783: connect: connection refused"
localai-api-1  | 3:29PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:38783): stderr 2023/08/11 15:29:50 gRPC Server listening at 127.0.0.1:38783
localai-api-1  | 3:29PM DBG GRPC Service Ready
localai-api-1  | 3:29PM DBG GRPC: Loading model with options: {state:{NoUnkeyedLiterals:{} DoNotCompare:[] DoNotCopy:[] atomicMessageInfo:<nil>} sizeCache:0 unknownFields:[] Model:/models/ggml-gpt4all-j ContextSize:512 Seed:0 NBatch:512 F16Memory:false MLock:false MMap:false VocabOnly:false LowVRAM:false Embeddings:false NUMA:false NGPULayers:0 MainGPU: TensorSplit: Threads:4 LibrarySearchPath: RopeFreqBase:0 RopeFreqScale:0 RMSNormEps:0 NGQA:0}
localai-api-1  | 3:29PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:38783): stderr create_gpt_params: loading model /models/ggml-gpt4all-j
localai-api-1  | 3:29PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:38783): stderr llama.cpp: loading model from /models/ggml-gpt4all-j
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:38783): stderr error loading model: unexpectedly reached end of file
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:38783): stderr llama_load_model_from_file: failed to load model
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:38783): stderr llama_init_from_gpt_params: error: failed to load model '/models/ggml-gpt4all-j'
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:38783): stderr load_binding_model: error: unable to load model
localai-api-1  | 3:30PM DBG [llama] Fails: could not load model: rpc error: code = Unknown desc = failed loading model
localai-api-1  | 3:30PM DBG [gpt4all] Attempting to load
localai-api-1  | 3:30PM DBG Loading model gpt4all from ggml-gpt4all-j
localai-api-1  | 3:30PM DBG Loading model in memory from file: /models/ggml-gpt4all-j
localai-api-1  | 3:30PM DBG Loading GRPC Model gpt4all: {backendString:gpt4all modelFile:ggml-gpt4all-j threads:4 assetDir:/tmp/localai/backend_data context:0xc00003e098 gRPCOptions:0xc00044a1e0 externalBackends:map[huggingface-embeddings:/build/extra/grpc/huggingface/huggingface.py]}
localai-api-1  | 3:30PM DBG Loading GRPC Process: /tmp/localai/backend_data/backend-assets/grpc/gpt4all
localai-api-1  | 3:30PM DBG GRPC Service for ggml-gpt4all-j will be running at: '127.0.0.1:43263'
localai-api-1  | 3:30PM DBG GRPC Service state dir: /tmp/go-processmanager4039263367
localai-api-1  | 3:30PM DBG GRPC Service Started
localai-api-1  | rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:43263: connect: connection refused"
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:43263): stderr 2023/08/11 15:30:04 gRPC Server listening at 127.0.0.1:43263
localai-api-1  | 3:30PM DBG GRPC Service Ready
localai-api-1  | 3:30PM DBG GRPC: Loading model with options: {state:{NoUnkeyedLiterals:{} DoNotCompare:[] DoNotCopy:[] atomicMessageInfo:<nil>} sizeCache:0 unknownFields:[] Model:/models/ggml-gpt4all-j ContextSize:512 Seed:0 NBatch:512 F16Memory:false MLock:false MMap:false VocabOnly:false LowVRAM:false Embeddings:false NUMA:false NGPULayers:0 MainGPU: TensorSplit: Threads:4 LibrarySearchPath:/tmp/localai/backend_data/backend-assets/gpt4all RopeFreqBase:0 RopeFreqScale:0 RMSNormEps:0 NGQA:0}
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:43263): stderr load_model: error 'Invalid argument'
localai-api-1  | 3:30PM DBG [gpt4all] Fails: could not load model: rpc error: code = Unknown desc = failed loading model
localai-api-1  | 3:30PM DBG [falcon] Attempting to load
localai-api-1  | 3:30PM DBG Loading model falcon from ggml-gpt4all-j
localai-api-1  | 3:30PM DBG Loading model in memory from file: /models/ggml-gpt4all-j
localai-api-1  | 3:30PM DBG Loading GRPC Model falcon: {backendString:falcon modelFile:ggml-gpt4all-j threads:4 assetDir:/tmp/localai/backend_data context:0xc00003e098 gRPCOptions:0xc00044a1e0 externalBackends:map[huggingface-embeddings:/build/extra/grpc/huggingface/huggingface.py]}
localai-api-1  | 3:30PM DBG Loading GRPC Process: /tmp/localai/backend_data/backend-assets/grpc/falcon
localai-api-1  | 3:30PM DBG GRPC Service for ggml-gpt4all-j will be running at: '127.0.0.1:33285'
localai-api-1  | 3:30PM DBG GRPC Service state dir: /tmp/go-processmanager1271205253
localai-api-1  | 3:30PM DBG GRPC Service Started
localai-api-1  | rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:33285: connect: connection refused"
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:33285): stderr 2023/08/11 15:30:05 gRPC Server listening at 127.0.0.1:33285
localai-api-1  | 3:30PM DBG GRPC Service Ready
localai-api-1  | 3:30PM DBG GRPC: Loading model with options: {state:{NoUnkeyedLiterals:{} DoNotCompare:[] DoNotCopy:[] atomicMessageInfo:<nil>} sizeCache:0 unknownFields:[] Model:/models/ggml-gpt4all-j ContextSize:512 Seed:0 NBatch:512 F16Memory:false MLock:false MMap:false VocabOnly:false LowVRAM:false Embeddings:false NUMA:false NGPULayers:0 MainGPU: TensorSplit: Threads:4 LibrarySearchPath:/tmp/localai/backend_data/backend-assets/gpt4all RopeFreqBase:0 RopeFreqScale:0 RMSNormEps:0 NGQA:0}
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:33285): stderr falcon.cpp: loading model from /models/ggml-gpt4all-j
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:33285): stderr falcon.cpp: file version 0 - loading foundation model for quantization only
localai-api-1  | [127.0.0.1]:60508  200  -  GET      /readyz
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:33285): stderr error loading model: unexpectedly reached end of file at position 3785248281 (total read: 1835143076)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:33285): stderr falcon_init_from_file: failed to load model
localai-api-1  | 3:30PM DBG [falcon] Fails: could not load model: rpc error: code = Unknown desc = failed loading model
localai-api-1  | 3:30PM DBG [gptneox] Attempting to load
localai-api-1  | 3:30PM DBG Loading model gptneox from ggml-gpt4all-j
localai-api-1  | 3:30PM DBG Loading model in memory from file: /models/ggml-gpt4all-j
localai-api-1  | 3:30PM DBG Loading GRPC Model gptneox: {backendString:gptneox modelFile:ggml-gpt4all-j threads:4 assetDir:/tmp/localai/backend_data context:0xc00003e098 gRPCOptions:0xc00044a1e0 externalBackends:map[huggingface-embeddings:/build/extra/grpc/huggingface/huggingface.py]}
localai-api-1  | 3:30PM DBG Loading GRPC Process: /tmp/localai/backend_data/backend-assets/grpc/gptneox
localai-api-1  | 3:30PM DBG GRPC Service for ggml-gpt4all-j will be running at: '127.0.0.1:35467'
localai-api-1  | 3:30PM DBG GRPC Service state dir: /tmp/go-processmanager1950436771
localai-api-1  | 3:30PM DBG GRPC Service Started
localai-api-1  | rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:35467: connect: connection refused"
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 2023/08/11 15:30:19 gRPC Server listening at 127.0.0.1:35467
localai-api-1  | 3:30PM DBG GRPC Service Ready
localai-api-1  | 3:30PM DBG GRPC: Loading model with options: {state:{NoUnkeyedLiterals:{} DoNotCompare:[] DoNotCopy:[] atomicMessageInfo:<nil>} sizeCache:0 unknownFields:[] Model:/models/ggml-gpt4all-j ContextSize:512 Seed:0 NBatch:512 F16Memory:false MLock:false MMap:false VocabOnly:false LowVRAM:false Embeddings:false NUMA:false NGPULayers:0 MainGPU: TensorSplit: Threads:4 LibrarySearchPath:/tmp/localai/backend_data/backend-assets/gpt4all RopeFreqBase:0 RopeFreqScale:0 RMSNormEps:0 NGQA:0}
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr GGML_ASSERT: /build/go-ggml-transformers/ggml.cpp/src/ggml.c:4128: wtype != GGML_TYPE_COUNT
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr SIGABRT: abort
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr PC=0x7f1698bc2ce1 m=5 sigcode=18446744073709551610
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr signal arrived during cgo execution
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr goroutine 34 [syscall]:
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.cgocall(0x80dba0, 0xc00009d908)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/cgocall.go:157 +0x5c fp=0xc00009d8e0 sp=0xc00009d8a8 pc=0x41ee5c
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr github.com/go-skynet/go-ggml-transformers%2ecpp._Cfunc_gpt_neox_bootstrap(0x14ef0f0, 0x7f1658000b60)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	_cgo_gotypes.go:400 +0x4c fp=0xc00009d908 sp=0xc00009d8e0 pc=0x80b1cc
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr github.com/go-skynet/go-ggml-transformers%2ecpp.NewGPTNeoX.func1(0xc00023c018?, 0x16?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/build/go-ggml-transformers/gptneox.go:23 +0x46 fp=0xc00009d948 sp=0xc00009d908 pc=0x80b5e6
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr github.com/go-skynet/go-ggml-transformers%2ecpp.NewGPTNeoX({0xc00023c018, 0x16})
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/build/go-ggml-transformers/gptneox.go:23 +0x56 fp=0xc00009d988 sp=0xc00009d948 pc=0x80b516
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr github.com/go-skynet/LocalAI/pkg/grpc/llm/transformers.(*GPTNeoX).Load(0xc0000142a0, 0x7f1698b45a68?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/build/pkg/grpc/llm/transformers/gptneox.go:21 +0x29 fp=0xc00009d9a8 sp=0xc00009d988 pc=0x80bd69
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr github.com/go-skynet/LocalAI/pkg/grpc.(*server).LoadModel(0x988d00?, {0xc000236140?, 0x5d6286?}, 0x0?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/build/pkg/grpc/server.go:42 +0x28 fp=0xc00009da10 sp=0xc00009d9a8 pc=0x80c888
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr github.com/go-skynet/LocalAI/pkg/grpc/proto._Backend_LoadModel_Handler({0x9669a0?, 0xc00006dd20}, {0xa49b10, 0xc00021c300}, 0xc0002220e0, 0x0)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/build/pkg/grpc/proto/backend_grpc.pb.go:236 +0x170 fp=0xc00009da68 sp=0xc00009da10 pc=0x809050
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc.(*Server).processUnaryRPC(0xc0000f81e0, {0xa4c798, 0xc0002c6000}, 0xc00023a000, 0xc00019ca20, 0xd0e770, 0x0)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/server.go:1360 +0xe23 fp=0xc00009de48 sp=0xc00009da68 pc=0x7f1c43
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc.(*Server).handleStream(0xc0000f81e0, {0xa4c798, 0xc0002c6000}, 0xc00023a000, 0x0)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/server.go:1737 +0xa36 fp=0xc00009df68 sp=0xc00009de48 pc=0x7f6d96
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc.(*Server).serveStreams.func1.1()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/server.go:982 +0x98 fp=0xc00009dfe0 sp=0xc00009df68 pc=0x7ef618
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goexit()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/asm_amd64.s:1598 +0x1 fp=0xc00009dfe8 sp=0xc00009dfe0 pc=0x481921
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr created by google.golang.org/grpc.(*Server).serveStreams.func1
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/server.go:980 +0x18c
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr goroutine 1 [IO wait]:
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:381 +0xd6 fp=0xc00018fb68 sp=0xc00018fb48 pc=0x452a96
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.netpollblock(0xc00018fbf8?, 0x41e4ef?, 0x0?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/netpoll.go:527 +0xf7 fp=0xc00018fba0 sp=0xc00018fb68 pc=0x44b3d7
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr internal/poll.runtime_pollWait(0x7f1670782ef8, 0x72)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/netpoll.go:306 +0x89 fp=0xc00018fbc0 sp=0xc00018fba0 pc=0x47c4c9
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr internal/poll.(*pollDesc).wait(0xc0000e6280?, 0x0?, 0x0)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x32 fp=0xc00018fbe8 sp=0xc00018fbc0 pc=0x4ea7d2
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr internal/poll.(*pollDesc).waitRead(...)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr internal/poll.(*FD).Accept(0xc0000e6280)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/internal/poll/fd_unix.go:614 +0x2bd fp=0xc00018fc90 sp=0xc00018fbe8 pc=0x4f00dd
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr net.(*netFD).accept(0xc0000e6280)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/net/fd_unix.go:172 +0x35 fp=0xc00018fd48 sp=0xc00018fc90 pc=0x6015d5
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr net.(*TCPListener).accept(0xc000012618)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/net/tcpsock_posix.go:148 +0x25 fp=0xc00018fd70 sp=0xc00018fd48 pc=0x619e45
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr net.(*TCPListener).Accept(0xc000012618)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/net/tcpsock.go:297 +0x3d fp=0xc00018fda0 sp=0xc00018fd70 pc=0x618f3d
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc.(*Server).Serve(0xc0000f81e0, {0xa493a0?, 0xc000012618})
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/server.go:844 +0x475 fp=0xc00018fee8 sp=0xc00018fda0 pc=0x7ee235
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr github.com/go-skynet/LocalAI/pkg/grpc.StartServer({0x7ffe4f27fd64?, 0xc000024190?}, {0xa4bd10?, 0xc0000142a0})
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/build/pkg/grpc/server.go:121 +0x125 fp=0xc00018ff50 sp=0xc00018fee8 pc=0x80d485
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr main.main()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/build/cmd/grpc/gptneox/main.go:20 +0x85 fp=0xc00018ff80 sp=0xc00018ff50 pc=0x80d5e5
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.main()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:250 +0x207 fp=0xc00018ffe0 sp=0xc00018ff80 pc=0x452667
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goexit()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/asm_amd64.s:1598 +0x1 fp=0xc00018ffe8 sp=0xc00018ffe0 pc=0x481921
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr goroutine 2 [force gc (idle)]:
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:381 +0xd6 fp=0xc000056fb0 sp=0xc000056f90 pc=0x452a96
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goparkunlock(...)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:387
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.forcegchelper()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:305 +0xb0 fp=0xc000056fe0 sp=0xc000056fb0 pc=0x4528d0
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goexit()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/asm_amd64.s:1598 +0x1 fp=0xc000056fe8 sp=0xc000056fe0 pc=0x481921
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr created by runtime.init.6
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:293 +0x25
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr goroutine 3 [GC sweep wait]:
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:381 +0xd6 fp=0xc000057780 sp=0xc000057760 pc=0x452a96
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goparkunlock(...)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:387
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.bgsweep(0x0?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/mgcsweep.go:278 +0x8e fp=0xc0000577c8 sp=0xc000057780 pc=0x43ec8e
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.gcenable.func1()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/mgc.go:178 +0x26 fp=0xc0000577e0 sp=0xc0000577c8 pc=0x433f46
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goexit()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/asm_amd64.s:1598 +0x1 fp=0xc0000577e8 sp=0xc0000577e0 pc=0x481921
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr created by runtime.gcenable
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/mgc.go:178 +0x6b
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr goroutine 4 [GC scavenge wait]:
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.gopark(0xc000032070?, 0xa425b8?, 0x1?, 0x0?, 0x0?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:381 +0xd6 fp=0xc000057f70 sp=0xc000057f50 pc=0x452a96
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goparkunlock(...)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:387
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.(*scavengerState).park(0xd5a7c0)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/mgcscavenge.go:400 +0x53 fp=0xc000057fa0 sp=0xc000057f70 pc=0x43cbb3
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.bgscavenge(0x0?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/mgcscavenge.go:628 +0x45 fp=0xc000057fc8 sp=0xc000057fa0 pc=0x43d185
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.gcenable.func2()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/mgc.go:179 +0x26 fp=0xc000057fe0 sp=0xc000057fc8 pc=0x433ee6
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goexit()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/asm_amd64.s:1598 +0x1 fp=0xc000057fe8 sp=0xc000057fe0 pc=0x481921
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr created by runtime.gcenable
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/mgc.go:179 +0xaa
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr goroutine 5 [finalizer wait]:
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.gopark(0x1a0?, 0xd5ace0?, 0x60?, 0x78?, 0xc000056770?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:381 +0xd6 fp=0xc000056628 sp=0xc000056608 pc=0x452a96
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.runfinq()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/mfinal.go:193 +0x107 fp=0xc0000567e0 sp=0xc000056628 pc=0x432f87
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goexit()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/asm_amd64.s:1598 +0x1 fp=0xc0000567e8 sp=0xc0000567e0 pc=0x481921
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr created by runtime.createfing
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/mfinal.go:163 +0x45
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr goroutine 19 [select]:
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.gopark(0xc00022bf00?, 0x2?, 0x83?, 0xdf?, 0xc00022bed4?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:381 +0xd6 fp=0xc00022bd60 sp=0xc00022bd40 pc=0x452a96
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.selectgo(0xc00022bf00, 0xc00022bed0, 0x624369?, 0x0, 0xc0002b0000?, 0x1)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/select.go:327 +0x7be fp=0xc00022bea0 sp=0xc00022bd60 pc=0x46267e
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc/internal/transport.(*controlBuffer).get(0xc0002960a0, 0x1)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/internal/transport/controlbuf.go:418 +0x115 fp=0xc00022bf30 sp=0xc00022bea0 pc=0x7633f5
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc/internal/transport.(*loopyWriter).run(0xc000222000)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/internal/transport/controlbuf.go:552 +0x91 fp=0xc00022bf90 sp=0xc00022bf30 pc=0x763b71
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc/internal/transport.NewServerTransport.func2()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/internal/transport/http2_server.go:341 +0xda fp=0xc00022bfe0 sp=0xc00022bf90 pc=0x77b55a
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goexit()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/asm_amd64.s:1598 +0x1 fp=0xc00022bfe8 sp=0xc00022bfe0 pc=0x481921
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr created by google.golang.org/grpc/internal/transport.NewServerTransport
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/internal/transport/http2_server.go:338 +0x1bb3
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr goroutine 20 [select]:
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.gopark(0xc000052f70?, 0x4?, 0x10?, 0x0?, 0xc000052ec0?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:381 +0xd6 fp=0xc000052d08 sp=0xc000052ce8 pc=0x452a96
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.selectgo(0xc000052f70, 0xc000052eb8, 0x0?, 0x0, 0x0?, 0x1)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/select.go:327 +0x7be fp=0xc000052e48 sp=0xc000052d08 pc=0x46267e
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc/internal/transport.(*http2Server).keepalive(0xc0002c6000)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/internal/transport/http2_server.go:1155 +0x233 fp=0xc000052fc8 sp=0xc000052e48 pc=0x782c33
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc/internal/transport.NewServerTransport.func4()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/internal/transport/http2_server.go:344 +0x26 fp=0xc000052fe0 sp=0xc000052fc8 pc=0x77b446
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goexit()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/asm_amd64.s:1598 +0x1 fp=0xc000052fe8 sp=0xc000052fe0 pc=0x481921
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr created by google.golang.org/grpc/internal/transport.NewServerTransport
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/internal/transport/http2_server.go:344 +0x1bf8
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr goroutine 21 [IO wait]:
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.gopark(0x100000008?, 0xb?, 0x0?, 0x0?, 0x6?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/proc.go:381 +0xd6 fp=0xc0002a2aa0 sp=0xc0002a2a80 pc=0x452a96
localai-api-1  | 3:30PM DBG [gptneox] Fails: could not load model: rpc error: code = Unavailable desc = error reading from server: EOF
localai-api-1  | 3:30PM DBG [bert-embeddings] Attempting to load
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.netpollblock(0x4cfc05?, 0x41e4ef?, 0x0?)
localai-api-1  | 3:30PM DBG Loading model bert-embeddings from ggml-gpt4all-j
localai-api-1  | 3:30PM DBG Loading model in memory from file: /models/ggml-gpt4all-j
localai-api-1  | 3:30PM DBG Loading GRPC Model bert-embeddings: {backendString:bert-embeddings modelFile:ggml-gpt4all-j threads:4 assetDir:/tmp/localai/backend_data context:0xc00003e098 gRPCOptions:0xc00044a1e0 externalBackends:map[huggingface-embeddings:/build/extra/grpc/huggingface/huggingface.py]}
localai-api-1  | 3:30PM DBG Loading GRPC Process: /tmp/localai/backend_data/backend-assets/grpc/bert-embeddings
localai-api-1  | 3:30PM DBG GRPC Service for ggml-gpt4all-j will be running at: '127.0.0.1:45525'
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/netpoll.go:527 +0xf7 fp=0xc0002a2ad8 sp=0xc0002a2aa0 pc=0x44b3d7
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr internal/poll.runtime_pollWait(0x7f1670782e08, 0x72)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/netpoll.go:306 +0x89 fp=0xc0002a2af8 sp=0xc0002a2ad8 pc=0x47c4c9
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr internal/poll.(*pollDesc).wait(0xc00029a000?, 0xc0002a8000?, 0x0)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x32 fp=0xc0002a2b20 sp=0xc0002a2af8 pc=0x4ea7d2
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr internal/poll.(*pollDesc).waitRead(...)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr internal/poll.(*FD).Read(0xc00029a000, {0xc0002a8000, 0x8000, 0x8000})
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/internal/poll/fd_unix.go:167 +0x299 fp=0xc0002a2bb8 sp=0xc0002a2b20 pc=0x4ebbb9
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr net.(*netFD).Read(0xc00029a000, {0xc0002a8000?, 0x1060100000000?, 0x8?})
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/net/fd_posix.go:55 +0x29 fp=0xc0002a2c00 sp=0xc0002a2bb8 pc=0x5ff449
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr net.(*conn).Read(0xc00029c000, {0xc0002a8000?, 0x18?, 0xc000100000?})
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/net/net.go:183 +0x45 fp=0xc0002a2c48 sp=0xc0002a2c00 pc=0x610f85
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr net.(*TCPConn).Read(0x800010601?, {0xc0002a8000?, 0x0?, 0xc0002a2ca8?})
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	<autogenerated>:1 +0x29 fp=0xc0002a2c78 sp=0xc0002a2c48 pc=0x624069
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr bufio.(*Reader).Read(0xc0002a6000, {0xc0002c0040, 0x9, 0x0?})
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/bufio/bufio.go:237 +0x1bb fp=0xc0002a2cb0 sp=0xc0002a2c78 pc=0x575e3b
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr io.ReadAtLeast({0xa45e00, 0xc0002a6000}, {0xc0002c0040, 0x9, 0x9}, 0x9)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/io/io.go:332 +0x9a fp=0xc0002a2cf8 sp=0xc0002a2cb0 pc=0x4c9b7a
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr io.ReadFull(...)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/io/io.go:351
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr golang.org/x/net/http2.readFrameHeader({0xc0002c0040?, 0x9?, 0xc000288060?}, {0xa45e00?, 0xc0002a6000?})
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/golang.org/x/[email protected]/http2/frame.go:237 +0x6e fp=0xc0002a2d48 sp=0xc0002a2cf8 pc=0x74ebce
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr golang.org/x/net/http2.(*Framer).ReadFrame(0xc0002c0000)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/golang.org/x/[email protected]/http2/frame.go:498 +0x95 fp=0xc0002a2df8 sp=0xc0002a2d48 pc=0x74f415
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc/internal/transport.(*http2Server).HandleStreams(0xc0002c6000, 0x0?, 0x0?)
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/internal/transport/http2_server.go:642 +0x167 fp=0xc0002a2f10 sp=0xc0002a2df8 pc=0x77e887
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc.(*Server).serveStreams(0xc0000f81e0, {0xa4c798?, 0xc0002c6000})
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/server.go:969 +0x162 fp=0xc0002a2f80 sp=0xc0002a2f10 pc=0x7ef362
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr google.golang.org/grpc.(*Server).handleRawConn.func1()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/server.go:912 +0x46 fp=0xc0002a2fe0 sp=0xc0002a2f80 pc=0x7eec06
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr runtime.goexit()
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/usr/local/go/src/runtime/asm_amd64.s:1598 +0x1 fp=0xc0002a2fe8 sp=0xc0002a2fe0 pc=0x481921
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr created by google.golang.org/grpc.(*Server).handleRawConn
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 	/go/pkg/mod/google.golang.org/[email protected]/server.go:911 +0x185
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr 
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr rax    0x0
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr rbx    0x7f166b7fe700
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr rcx    0x7f1698bc2ce1
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr rdx    0x0
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr rdi    0x2
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr rsi    0x7f166b7fd7c0
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr rbp    0x7f166b7fdd80
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr rsp    0x7f166b7fd7c0
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr r8     0x0
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr r9     0x7f166b7fd7c0
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr r10    0x8
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr r11    0x246
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr r12    0x7f1658570270
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr r13    0x12
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr r14    0x7f166b7fdae0
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr r15    0x7f1658000bd8
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr rip    0x7f1698bc2ce1
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr rflags 0x246
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr cs     0x33
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr fs     0x0
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:35467): stderr gs     0x0
localai-api-1  | 3:30PM DBG GRPC Service state dir: /tmp/go-processmanager87202876
localai-api-1  | 3:30PM DBG GRPC Service Started
localai-api-1  | rpc error: code = Unavailable desc = connection error: desc = "transport: Error while dialing: dial tcp 127.0.0.1:45525: connect: connection refused"
localai-api-1  | 3:30PM DBG GRPC(ggml-gpt4all-j-127.0.0.1:45525): stderr 2023/08/11 15:30:21 gRPC Server listening at 127.0.0.1:45525
localai-api-1  | 3:30PM DBG GRPC Service Ready
localai-api-1  | 3:30PM DBG GRPC: Loading model with options: {state:{NoUnkeyedLiterals:{} DoNotCompare:[] DoNotCopy:[] atomicMessageInfo:<nil>} sizeCache:0 unknownFields:[] Model:/models/ggml-gpt4all-j ContextSize:512 Seed:0 NBatch:512 F16Memory:false MLock:false MMap:false VocabOnly:false LowVRAM:false Embeddings:false NUMA:false NGPULayers:0 MainGPU: TensorSplit: Threads:4 LibrarySearchPath:/tmp/localai/backend_data/backend-assets/gpt4all RopeFreqBase:0 RopeFreqScale:0 RMSNormEps:0 NGQA:0}
localai-api-1  | [127.0.0.1]:42090  200  -  GET      /readyz
localai-api-1  | [127.0.0.1]:39630  200  -  GET      /readyz