
[BUG] Deepseek Integration #1359

Closed
angelarias2014 opened this issue Sep 26, 2024 · 3 comments
Labels
bug Something isn't working

Comments

@angelarias2014

angelarias2014 commented Sep 26, 2024

Description

Hi, has anybody tried to connect CrewAI with DeepSeek?

I'm trying with the LangChain and LiteLLM components and cannot connect to this model.

I updated to the new version 0.6.3.1 with the LLM class, and when I configure it with DeepSeek it shows me an error and cannot connect to the model.

Steps to Reproduce

The code example:

....
from crewai import LLM
....
....

# Model integration

deepseek_llm = LLM(
    model="deepseek-chat",
    temperature=1.3,
    max_tokens=150,
    base_url="https://api.deepseek.com",
    api_key=DEEPSEEK_API_KEY
)
.....
.....
agent = Agent(
    role=role,
    goal=goal,
    backstory=backstory,
    verbose=True,
    allow_delegation=True,
    tools=selected_tools,
    llm=deepseek_llm
)

Expected behavior

In the docs, the configuration is:
from crewai import Agent, LLM

llm = LLM(
    model="gpt-4",
    temperature=0.7,
    base_url="https://api.openai.com/v1",
    api_key="your-api-key-here"
)

agent = Agent(
    role='Customized LLM Expert',
    goal='Provide tailored responses',
    backstory="An AI assistant with custom LLM settings.",
    llm=llm
)

In my configuration:

from crewai import Agent, LLM

deepseek_llm = LLM(
    model="deepseek-chat",
    temperature=1.3,
    max_tokens=150,
    base_url="https://api.deepseek.com",
    api_key=DEEPSEEK_API_KEY
)

agent = Agent(
    role=role,
    goal=goal,
    backstory=backstory,
    verbose=True,
    allow_delegation=True,
    tools=selected_tools,
    llm=deepseek_llm
)

Screenshots/Code snippets

Error output:

Provider List: https://docs.litellm.ai/docs/providers

Provider List: https://docs.litellm.ai/docs/providers

Provider List: https://docs.litellm.ai/docs/providers

Provider List: https://docs.litellm.ai/docs/providers

Provider List: https://docs.litellm.ai/docs/providers

Provider List: https://docs.litellm.ai/docs/providers

Error in console:
2024-09-26 17:23:29,861 - 140617412761152 - llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable
2024-09-26 17:23:29,865 - 140617412761152 - llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable
2024-09-26 17:23:30,173 - 140617412761152 - llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable
2024-09-26 17:23:30,175 - 140617412761152 - llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable
2024-09-26 17:23:30,181 - 140617412761152 - llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable
2024-09-26 17:23:30,182 - 140617412761152 - llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable

Operating System

Ubuntu 20.04

Python Version

3.10

crewAI Version

0.6.3.1

crewAI Tools Version

Virtual Environment

Venv

Evidence

(Same console errors as shown above.)

Possible Solution

None

Additional context

I've tried several configurations to connect CrewAI to this DeepSeek model.

I tried with the LangChain library and the LiteLLM library, but it always shows me an error, either from those libraries or from the OpenAI library.

Please, can someone help me?

Thank you.

angelarias2014 added the bug (Something isn't working) label on Sep 26, 2024
@joaomdmoura
Collaborator

Version 0.64.0 is out and fixes this.

from crewai import LLM
from crewai_tools import FileReadTool  # import assumed from the crewai_tools package
# ...
return Agent(
    config=self.agents_config['researcher'],
    tools=[FileReadTool()],
    llm=LLM(model="deepseek/deepseek-chat"),
    verbose=True
)
# ...

I also set an env var DEEPSEEK_API_KEY
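
For reference, a minimal end-to-end sketch of this setup (the role/goal/backstory strings and the key value below are placeholders, not from this thread; LiteLLM picks the key up from the DEEPSEEK_API_KEY environment variable):

import os

from crewai import Agent, LLM

# Assumption: the key is normally exported in the environment beforehand, e.g.
#   export DEEPSEEK_API_KEY="sk-..."
# The value below is only a placeholder for a quick local test.
os.environ.setdefault("DEEPSEEK_API_KEY", "sk-your-key-here")

# The "deepseek/" prefix tells LiteLLM which provider to route the call to;
# the bare "deepseek-chat" string above appears to be what it could not resolve.
deepseek_llm = LLM(model="deepseek/deepseek-chat", temperature=1.3, max_tokens=150)

agent = Agent(
    role="Research assistant",                        # placeholder
    goal="Provide tailored responses",                # placeholder
    backstory="An AI assistant backed by DeepSeek.",  # placeholder
    llm=deepseek_llm,
    verbose=True,
)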

@joaomdmoura
Collaborator

this was a great catch btw, sorry about the inconvenience 😅

@angelarias2014
Author

Thank you very much, Joao. You are doing an incredible job with CrewAI.
