Comments (2)
Am I the only one who has continuous problems with this? What am I doing wrong? I have set my API key I don't know how many times.
dot@meta:~/Cross-Link-Manager$ interpreter -y --model gpt-4o --max_tokens 4096 --context_window 10000 --max_output 10000 --force_task_completion --fast --auto_run
We have updated our profile file format. Would you like to migrate your profile file to the new format? No data will be lost.
(y/n) y
Migration complete.
▌ A new version of Open Interpreter is available.
▌ Please run: pip install --upgrade open-interpreter
────────────────────────────────────────────────────────────────────────────────────────────────────
> wget https://huggingface.co/jartine/Mixtral-8x7B-v0.1.llamafile/resolve/main/mixtral-8x7b-instruct-v0.1.Q5_K_M-server.llamafile
Traceback (most recent call last):
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 736, in completion
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 655, in completion
return self.streaming(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 833, in streaming
response = openai_client.chat.completions.create(**data, timeout=timeout)
File "/home/dot/.local/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
return func(*args, **kwargs)
File "/home/dot/.local/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 606, in create
return self._post(
File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1240, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
return self._request(
File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 1112, in completion
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 1085, in completion
response = openai_chat_completions.completion(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 742, in completion
raise OpenAIError(status_code=e.status_code, message=str(e))
litellm.llms.openai.OpenAIError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 344, in fixed_litellm_completions
yield from litellm.completion(**params)
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3472, in wrapper
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3363, in wrapper
result = original_function(*args, **kwargs)
File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 2480, in completion
raise exception_type(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 9927, in exception_type
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 8500, in exception_type
raise AuthenticationError(
litellm.exceptions.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/respond.py", line 78, in respond
for chunk in interpreter.llm.run(messages_for_llm):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 263, in run
yield from run_function_calling_llm(self, params)
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/run_function_calling_llm.py", line 44, in run_function_calling_llm
for chunk in llm.completions(**request_params):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 347, in fixed_litellm_completions
raise first_error
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 328, in fixed_litellm_completions
yield from litellm.completion(**params)
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3472, in wrapper
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3363, in wrapper
result = original_function(*args, **kwargs)
File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 2480, in completion
raise exception_type(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 9927, in exception_type
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 8467, in exception_type
raise AuthenticationError(
litellm.exceptions.AuthenticationError: AuthenticationError: OpenAIException - Traceback (most recent call last):
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 736, in completion
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 655, in completion
return self.streaming(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 813, in streaming
openai_client = self._get_openai_client(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 550, in _get_openai_client
_new_client = openai(
File "/home/dot/.local/lib/python3.10/site-packages/openai/_client.py", line 104, in __init__
raise openaiError(
openai.openaiError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/bin/interpreter", line 8, in <module>
sys.exit(main())
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 509, in main
start_terminal_interface(interpreter)
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/terminal_interface/start_terminal_interface.py", line 475, in start_terminal_interface
interpreter.chat()
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/core.py", line 200, in chat
for _ in self._streaming_chat(message=message, display=display):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/core.py", line 232, in _streaming_chat
yield from terminal_interface(self, message)
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/terminal_interface/terminal_interface.py", line 133, in terminal_interface
for chunk in interpreter.chat(message, display=False, stream=True):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/core.py", line 271, in _streaming_chat
yield from self._respond_and_store()
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/core.py", line 321, in _respond_and_store
for chunk in respond(self):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/respond.py", line 101, in respond
raise Exception(
Exception: Traceback (most recent call last):
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 736, in completion
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 655, in completion
return self.streaming(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 833, in streaming
response = openai_client.chat.completions.create(**data, timeout=timeout)
File "/home/dot/.local/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
return func(*args, **kwargs)
File "/home/dot/.local/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 606, in create
return self._post(
File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1240, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 921, in request
return self._request(
File "/home/dot/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 1112, in completion
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 1085, in completion
response = openai_chat_completions.completion(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 742, in completion
raise OpenAIError(status_code=e.status_code, message=str(e))
litellm.llms.openai.OpenAIError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 344, in fixed_litellm_completions
yield from litellm.completion(**params)
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3472, in wrapper
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3363, in wrapper
result = original_function(*args, **kwargs)
File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 2480, in completion
raise exception_type(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 9927, in exception_type
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 8500, in exception_type
raise AuthenticationError(
litellm.exceptions.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: x. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/respond.py", line 78, in respond
for chunk in interpreter.llm.run(messages_for_llm):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 263, in run
yield from run_function_calling_llm(self, params)
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/run_function_calling_llm.py", line 44, in run_function_calling_llm
for chunk in llm.completions(**request_params):
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 347, in fixed_litellm_completions
raise first_error
File "/home/dot/.local/lib/python3.10/site-packages/interpreter/core/llm/llm.py", line 328, in fixed_litellm_completions
yield from litellm.completion(**params)
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3472, in wrapper
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 3363, in wrapper
result = original_function(*args, **kwargs)
File "/home/dot/.local/lib/python3.10/site-packages/litellm/main.py", line 2480, in completion
raise exception_type(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 9927, in exception_type
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/utils.py", line 8467, in exception_type
raise AuthenticationError(
litellm.exceptions.AuthenticationError: AuthenticationError: OpenAIException - Traceback (most recent call last):
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 736, in completion
raise e
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 655, in completion
return self.streaming(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 813, in streaming
openai_client = self._get_openai_client(
File "/home/dot/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 550, in _get_openai_client
_new_client = openai(
File "/home/dot/.local/lib/python3.10/site-packages/openai/_client.py", line 104, in __init__
raise openaiError(
openai.openaiError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
There might be an issue with your API key(s).
To reset your API key (we'll use OPENAI_API_KEY for this example, but you may need to reset your ANTHROPIC_API_KEY, HUGGINGFACE_API_KEY, etc):
Mac/Linux: 'export OPENAI_API_KEY=your-key-here'. Update your ~/.zshrc on MacOS or ~/.bashrc on Linux with the new key if it has already been persisted there.
Windows: 'setx OPENAI_API_KEY your-key-here' then restart terminal.
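The reset instructions printed above can be sketched as a quick shell check. This is a minimal sketch, not the tool's own procedure; the key value is a placeholder, and the ~/.bashrc path is the Linux example from the message (use ~/.zshrc on macOS):

```shell
# Make sure the key is visible to the *same* shell that launches interpreter.
# "sk-your-key-here" is a placeholder, not a real key.
export OPENAI_API_KEY="sk-your-key-here"

# Confirm the variable is actually non-empty before starting interpreter:
echo "${OPENAI_API_KEY:+OPENAI_API_KEY is set}"

# Persist it so future terminals inherit it (Linux example; ~/.zshrc on macOS).
# Guarded so the line is only appended once:
grep -q '^export OPENAI_API_KEY=' ~/.bashrc 2>/dev/null || \
  echo 'export OPENAI_API_KEY="sk-your-key-here"' >> ~/.bashrc
```

Note the 401 in the log says the key litellm received was literally `x`, which suggests the exported value never reached the interpreter process, e.g. it was set in a different shell or before a profile reload.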
dot@meta:~/Cross-Link-Manager$ wget https://huggingface.co/jartine/Mixtral-8x7B-v0.1.llamafile/resolve/main/mixtral-8x7b-instruct-v0.1.Q5_K_M-server.llamafile
Will not apply HSTS. The HSTS database must be a regular and non-world-writable file.
ERROR: could not open HSTS store at '/home/dot/.wget-hsts'. HSTS will be disabled.
--2024-06-19 16:35:48-- https://huggingface.co/jartine/Mixtral-8x7B-v0.1.llamafile/resolve/main/mixtral-8x7b-instruct-v0.1.Q5_K_M-server.llamafile
Resolving huggingface.co (huggingface.co)... 2600:9000:234c:dc00:17:b174:6d00:93a1, 2600:9000:234c:8e00:17:b174:6d00:93a1, 2600:9000:234c:e800:17:b174:6d00:93a1, ...
Connecting to huggingface.co (huggingface.co)|2600:9000:234c:dc00:17:b174:6d00:93a1|:443... connected.
HTTP request sent, awaiting response... 401 Unauthorized
Username/Password Authentication Failed.
dot@meta:~/Cross-Link-Manager$
from open-interpreter.
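The wget 401 ("Username/Password Authentication Failed") in the transcript above usually means the Hugging Face repo rejects anonymous requests. A minimal sketch, assuming you have created a user access token (the `hf_xxx` value below is a hypothetical placeholder, and some repos also require accepting the license on the model page first):

```shell
# Hypothetical placeholder token; create a real one at https://huggingface.co/settings/tokens
HF_TOKEN="hf_xxx"
AUTH_HEADER="Authorization: Bearer ${HF_TOKEN}"
URL="https://huggingface.co/jartine/Mixtral-8x7B-v0.1.llamafile/resolve/main/mixtral-8x7b-instruct-v0.1.Q5_K_M-server.llamafile"

# Anonymous requests to gated repos get 401 Unauthorized; the Bearer header
# authenticates the download. Guarded so the placeholder never hits the network:
if [ "${HF_TOKEN}" != "hf_xxx" ]; then
  wget --continue --header="${AUTH_HEADER}" "${URL}"
  chmod +x "$(basename "${URL}")"
fi
```

`--continue` lets the multi-gigabyte llamafile resume if the connection drops; `chmod +x` is needed because llamafiles are executed directly.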
Nothing happens.