neondatabase / ask-neon
Chatbot: Search your own knowledge base by semantic similarity
Home Page: https://neon.tech/ai
Ran the full install steps (with workarounds for the issues reported separately) and tried asking a question in the browser on localhost.
Expected: the chatbot answers the question.
Actual: an error in the terminal:
Error: A Node.js API is used (process.nextTick) which is not supported in the Edge Runtime.
MacBook M2
Full error:
npm run dev
[email protected] dev
next dev
▲ Next.js 13.5.4
✓ Ready in 5.8s
⚠ Fast Refresh had to perform a full reload. Read more: https://nextjs.org/docs/messages/fast-refresh-reload
✓ Compiled / in 630ms (197 modules)
⚠ ./node_modules/@neondatabase/serverless/index.js
Critical dependency: the request of a dependency is an expression
Import trace for requested module:
./node_modules/@neondatabase/serverless/index.js
./node_modules/@neondatabase/serverless/index.mjs
○ Compiling /api/askme ...
Querying database...
Error: A Node.js API is used (process.nextTick) which is not supported in the Edge Runtime.
Learn more: https://nextjs.org/docs/api-reference/edge-runtime
...
⨯ node_modules/@neondatabase/serverless/index.js (43:24657) @ nextTick
⨯ uncaughtException: A Node.js API is used (process.nextTick) which is not supported in the Edge Runtime.
Learn more: https://nextjs.org/docs/api-reference/edge-runtime
null
⨯ node_modules/@neondatabase/serverless/index.js (43:24657) @ nextTick
⨯ uncaughtException: A Node.js API is used (process.nextTick) which is not supported in the Edge Runtime.
Learn more: https://nextjs.org/docs/api-reference/edge-runtime
python main.py
OSError: Cannot save file into a non-existent directory: 'processed'
Workaround: manually created a folder called processed in /data.
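Rather than requiring the manual workaround, the script could create the output directory itself before saving. A minimal sketch (the directory name comes from the error above; everything else is assumed, not code from the repo):

```python
import os

# Create the 'processed' output directory if it does not already exist,
# so the later CSV save cannot fail with "non-existent directory".
os.makedirs("processed", exist_ok=True)
```

`exist_ok=True` makes this safe to run on every invocation, so no manual setup is needed.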
Running python main.py fails because the crawler saved its output under a folder whose name starts with www., while the script looks for the bare domain:
$ python main.py
Traceback (most recent call last):
File "/ask-neon/data/main.py", line 40, in <module>
for file in os.listdir("text/" + domain + "/"):
FileNotFoundError: [Errno 2] No such file or directory: 'text/postgresql.org/'
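One possible fix (a sketch; the helper name is an assumption, not the repo's actual code) is to normalise the hostname in one place, so the crawl step and the processing step agree on the folder name whichever form the start URL used:

```python
from urllib.parse import urlparse

def domain_dir(url: str) -> str:
    """Return the directory name used for a crawled site, stripping any
    leading 'www.' so 'www.postgresql.org' and 'postgresql.org' map to
    the same folder."""
    host = urlparse(url).netloc
    return host[len("www."):] if host.startswith("www.") else host
```

With this, `os.listdir("text/" + domain_dir(url) + "/")` would match the folder the crawler wrote, provided the crawler uses the same helper when creating it.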
pip install -r requirements.txt
Most packages install successfully, but psycopg2 fails to build from source:
Collecting aiohttp==3.8.3
Using cached aiohttp-3.8.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.0 MB)
Collecting aiosignal==1.3.1
Using cached aiosignal-1.3.1-py3-none-any.whl (7.6 kB)
Collecting appnope==0.1.3
Using cached appnope-0.1.3-py2.py3-none-any.whl (4.4 kB)
Collecting asttokens==2.2.1
Using cached asttokens-2.2.1-py2.py3-none-any.whl (26 kB)
Collecting async-timeout==4.0.2
Using cached async_timeout-4.0.2-py3-none-any.whl (5.8 kB)
Collecting attrs==22.2.0
Using cached attrs-22.2.0-py3-none-any.whl (60 kB)
Collecting backcall==0.2.0
Using cached backcall-0.2.0-py2.py3-none-any.whl (11 kB)
Collecting beautifulsoup4==4.11.1
Using cached beautifulsoup4-4.11.1-py3-none-any.whl (128 kB)
Collecting blobfile==2.0.1
Using cached blobfile-2.0.1-py3-none-any.whl (73 kB)
Collecting bs4==0.0.1
Using cached bs4-0.0.1.tar.gz (1.1 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Collecting certifi==2022.12.7
Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting charset-normalizer==2.1.1
Using cached charset_normalizer-2.1.1-py3-none-any.whl (39 kB)
Collecting comm==0.1.2
Using cached comm-0.1.2-py3-none-any.whl (6.5 kB)
Collecting contourpy==1.0.7
Using cached contourpy-1.0.7-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (300 kB)
Collecting cycler==0.11.0
Using cached cycler-0.11.0-py3-none-any.whl (6.4 kB)
Collecting debugpy==1.6.5
Using cached debugpy-1.6.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB)
Collecting decorator==5.1.1
Using cached decorator-5.1.1-py3-none-any.whl (9.1 kB)
Collecting docopt==0.6.2
Using cached docopt-0.6.2.tar.gz (25 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Collecting entrypoints==0.4
Using cached entrypoints-0.4-py3-none-any.whl (5.3 kB)
Collecting executing==1.2.0
Using cached executing-1.2.0-py2.py3-none-any.whl (24 kB)
Collecting filelock==3.9.0
Using cached filelock-3.9.0-py3-none-any.whl (9.7 kB)
Collecting fonttools==4.38.0
Using cached fonttools-4.38.0-py3-none-any.whl (965 kB)
Collecting frozenlist==1.3.3
Using cached frozenlist-1.3.3-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (149 kB)
Collecting html==1.13
Using cached html-1.13.tar.gz (6.7 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Collecting huggingface-hub==0.11.1
Using cached huggingface_hub-0.11.1-py3-none-any.whl (182 kB)
Requirement already satisfied: idna==3.4 in ./env/lib/python3.10/site-packages (from -r requirements.txt (line 26)) (3.4)
Collecting ipykernel==6.20.1
Using cached ipykernel-6.20.1-py3-none-any.whl (149 kB)
Collecting ipython==8.8.0
Using cached ipython-8.8.0-py3-none-any.whl (775 kB)
Collecting jedi==0.18.2
Using cached jedi-0.18.2-py2.py3-none-any.whl (1.6 MB)
Collecting joblib==1.2.0
Using cached joblib-1.2.0-py3-none-any.whl (297 kB)
Collecting jupyter_client==7.4.8
Using cached jupyter_client-7.4.8-py3-none-any.whl (133 kB)
Collecting jupyter_core==5.1.3
Using cached jupyter_core-5.1.3-py3-none-any.whl (93 kB)
Collecting kiwisolver==1.4.4
Using cached kiwisolver-1.4.4-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.6 MB)
Collecting lxml==4.9.2
Using cached lxml-4.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (7.1 MB)
Collecting matplotlib==3.6.3
Using cached matplotlib-3.6.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.8 MB)
Collecting matplotlib-inline==0.1.6
Using cached matplotlib_inline-0.1.6-py3-none-any.whl (9.4 kB)
Collecting multidict==6.0.4
Using cached multidict-6.0.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (114 kB)
Collecting nest-asyncio==1.5.6
Using cached nest_asyncio-1.5.6-py3-none-any.whl (5.2 kB)
Collecting numpy==1.24.1
Using cached numpy-1.24.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.3 MB)
Collecting openai==0.26.1
Using cached openai-0.26.1.tar.gz (55 kB)
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Collecting packaging==23.0
Using cached packaging-23.0-py3-none-any.whl (42 kB)
Collecting pandas==1.5.2
Using cached pandas-1.5.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.1 MB)
Collecting parso==0.8.3
Using cached parso-0.8.3-py2.py3-none-any.whl (100 kB)
Collecting pexpect==4.8.0
Using cached pexpect-4.8.0-py2.py3-none-any.whl (59 kB)
Collecting pickleshare==0.7.5
Using cached pickleshare-0.7.5-py2.py3-none-any.whl (6.9 kB)
Collecting Pillow==9.4.0
Using cached Pillow-9.4.0-cp310-cp310-manylinux_2_28_x86_64.whl (3.4 MB)
Collecting pipreqs==0.4.11
Using cached pipreqs-0.4.11-py2.py3-none-any.whl (32 kB)
Collecting platformdirs==2.6.2
Using cached platformdirs-2.6.2-py3-none-any.whl (14 kB)
Collecting plotly==5.12.0
Using cached plotly-5.12.0-py2.py3-none-any.whl (15.2 MB)
Collecting prompt-toolkit==3.0.36
Using cached prompt_toolkit-3.0.36-py3-none-any.whl (386 kB)
Collecting psutil==5.9.4
Using cached psutil-5.9.4-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (280 kB)
Collecting psycopg2==2.9.5
Using cached psycopg2-2.9.5.tar.gz (384 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'error'
error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [23 lines of output]
running egg_info
creating /tmp/pip-pip-egg-info-f2skvrol/psycopg2.egg-info
writing /tmp/pip-pip-egg-info-f2skvrol/psycopg2.egg-info/PKG-INFO
writing dependency_links to /tmp/pip-pip-egg-info-f2skvrol/psycopg2.egg-info/dependency_links.txt
writing top-level names to /tmp/pip-pip-egg-info-f2skvrol/psycopg2.egg-info/top_level.txt
writing manifest file '/tmp/pip-pip-egg-info-f2skvrol/psycopg2.egg-info/SOURCES.txt'
Error: pg_config executable not found.
pg_config is required to build psycopg2 from source. Please add the directory
containing pg_config to the $PATH or specify the full executable path with the
option:
python setup.py build_ext --pg-config /path/to/pg_config build ...
or with the pg_config option in 'setup.cfg'.
If you prefer to avoid building psycopg2 from source, please install the PyPI
'psycopg2-binary' package instead.
For further information please check the 'doc/src/install.rst' file (also at
<https://www.psycopg.org/docs/install.html>).
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
Ubuntu 22.04.2
The README describes output during the data phase, but no code exists in the script to print it. The script works; the silence just might make people think it didn't. Expected output per the README:
Saving to CSV...
Loading tokenizer...
Embedding text...
Connecting to database...
Done!
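If the missing messages are worth adding, a minimal sketch would be a small helper called at each phase boundary (the phase strings come from the README; the helper itself is an assumption):

```python
def announce(phase: str) -> str:
    """Print a pipeline phase so progress is visible on the terminal."""
    print(phase)
    return phase  # returned so callers can log or test the message

announce("Saving to CSV...")
# ...the actual save/tokenize/embed/connect steps would follow each call
```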
Without upgrading the packages, I get:
error - node_modules/@neondatabase/serverless/index.js (43:0) @ <unknown>
error - p.allocUnsafe is not a function