
fix: 🐛 Llama Index imports and track costs and token counts in the class #47

Merged — 8 commits, Jun 4, 2024

Conversation

@AndreCNF (Contributor):

This pull request fixes the Llama Index imports in the code and adds functionality to track costs and token counts in the TokenCostHandler class. The TokenCostHandler class now includes variables to store the prompt cost, completion cost, prompt tokens, and completion tokens. The calculate_prompt_cost and calculate_completion_cost functions have been updated to return both the cost and the number of tokens. Additionally, a reset_counts method has been added to reset the counts (e.g., at the end of each agent query, similar to what Llama Index's original token-counting callback does).
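The handler described above can be sketched roughly as follows. This is a minimal illustration, not tokencost's actual implementation: the per-token prices and the whitespace "tokenizer" are placeholder assumptions, and the real callback hooks into Llama Index events rather than taking raw strings.

```python
from decimal import Decimal

# Hypothetical flat per-token prices; the real library looks prices
# up per model in its pricing table.
PROMPT_PRICE = Decimal("0.000001")
COMPLETION_PRICE = Decimal("0.000002")


class TokenCostHandler:
    """Minimal sketch of a callback that accumulates costs and token counts."""

    def __init__(self) -> None:
        self.reset_counts()

    def _count_tokens(self, text: str) -> int:
        # Stand-in for a real tokenizer (e.g. tiktoken); splits on whitespace.
        return len(text.split())

    def on_llm_end(self, prompt: str, completion: str) -> None:
        # Accumulate tokens and costs across calls until reset_counts().
        prompt_tokens = self._count_tokens(prompt)
        completion_tokens = self._count_tokens(completion)
        self.prompt_tokens += prompt_tokens
        self.completion_tokens += completion_tokens
        self.prompt_cost += prompt_tokens * PROMPT_PRICE
        self.completion_cost += completion_tokens * COMPLETION_PRICE

    def reset_counts(self) -> None:
        # Called e.g. at the end of each agent query.
        self.prompt_cost = Decimal(0)
        self.completion_cost = Decimal(0)
        self.prompt_tokens = 0
        self.completion_tokens = 0
```

Keeping the counters on the handler instance mirrors Llama Index's own TokenCountingHandler, which is why a reset method is needed between queries.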

@areibman (Contributor):

Thanks for raising this! I’m not able to run and test for another day or so, but can you please also run the unit tests to account for the num tokens change? I believe these are breaking changes

@AndreCNF (Contributor, Author):

> Thanks for raising this! I’m not able to run and test for another day or so, but can you please also run the unit tests to account for the num tokens change? I believe these are breaking changes

I've made some extra changes so that the unit tests pass again ✅

@areibman (Contributor):

>> Thanks for raising this! I’m not able to run and test for another day or so, but can you please also run the unit tests to account for the num tokens change? I believe these are breaking changes
>
> I've made some extra changes so that the unit tests pass again ✅

Thanks for this! Overall, I think the LlamaIndex stuff should work, but I'd prefer not to merge the new way of returning token counts + cost in the single function. We already have token counting functionality as separate functions, so there's no need to overload what we already have.

The main issue is that these changes will break every other user's installation of tokencost, because the cost estimators will return tuples instead of decimals.

One possible alternative is creating a function called "calculate_everything" (or something like that) that returns a dict/tuple of all the information.

…riginal `calculate_prompt_cost` and `calculate_completion_cost` functions
@AndreCNF (Contributor, Author):

AndreCNF commented Jun 3, 2024

>> Thanks for raising this! I’m not able to run and test for another day or so, but can you please also run the unit tests to account for the num tokens change? I believe these are breaking changes
>>
>> I've made some extra changes so that the unit tests pass again ✅
>
> Thanks for this! Overall, I think the LlamaIndex stuff should work, but I'd prefer not to merge the new way of returning token counts + cost in the single function. We already have token counting functionality as separate functions, so there's no need to overload what we already have.
>
> The main issue is that these changes will break every other user's installation of tokencost, because the cost estimators will return tuples instead of decimals.
>
> One possible alternative is creating a function called "calculate_everything" (or something like that) that returns a dict/tuple of all the information.

My latest commit reverts the breaking changes and implements a calculate_all_costs_and_tokens function, which is used in the Llama Index callback.
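The resolution above can be sketched as follows: the original single-value functions keep returning Decimals (no breaking change), and a new aggregate helper bundles everything into one dict. The prices and the whitespace tokenizer here are placeholder assumptions for illustration; tokencost's real functions take a model name and use its pricing table and tokenizer.

```python
from decimal import Decimal

# Hypothetical flat per-token prices, for illustration only.
PROMPT_PRICE = Decimal("0.000001")
COMPLETION_PRICE = Decimal("0.000002")


def count_tokens(text: str) -> int:
    # Stand-in tokenizer; the real library uses a model-specific tokenizer.
    return len(text.split())


def calculate_prompt_cost(prompt: str) -> Decimal:
    # Unchanged public API: still returns a Decimal, not a tuple.
    return count_tokens(prompt) * PROMPT_PRICE


def calculate_completion_cost(completion: str) -> Decimal:
    # Likewise unchanged: a single Decimal return value.
    return count_tokens(completion) * COMPLETION_PRICE


def calculate_all_costs_and_tokens(prompt: str, completion: str) -> dict:
    # New aggregate helper: bundles costs and token counts in one dict,
    # leaving the existing single-value functions untouched.
    return {
        "prompt_cost": calculate_prompt_cost(prompt),
        "completion_cost": calculate_completion_cost(completion),
        "prompt_tokens": count_tokens(prompt),
        "completion_tokens": count_tokens(completion),
    }
```

This split keeps backward compatibility for existing callers while giving the callback a single call that returns everything it needs.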

Let me know if there are still any issues to resolve.

Review comments on README.md and tokencost/callbacks/llama_index.py were resolved.
@areibman areibman merged commit d5c6abf into AgentOps-AI:main Jun 4, 2024
1 check passed
@areibman (Contributor):

areibman commented Jun 4, 2024

Champion. Thank you so much @AndreCNF !

@AndreCNF AndreCNF deleted the fix/llama_index_callback_agents branch June 4, 2024 11:21