Release 1.3.4 (#100)
* Pagination bug

* Bug fix

* chore: add docker cmd

* Compatibility fixes for SDK version 2.0.0 (#69)

* Pagination bug

* Bug fix

* Fix for schema changes

* Render tool calling

* Support for Langgraph, Qdrant & Groq (#73)

* Pagination bug

* Bug fix

* Add langgraph support

* Qdrant support

* Add Groq support

* update README

* update README

* feat: optimise docker image for self host setup

* adding api access to traces endpoint

* clean up

* refactor

* feat: add clickhouse db create on app start (#79)

* docs: add railway deploy, fix sdk badges (#81)

* untrack .env

* Revert "untrack .env"

This reverts commit 4551d7e.

* Playground and Prompt Management (#83)

* Pagination bug

* Bug fix

* Playground - basic implementation

* Playground - streaming and nonstreaming

* Move playground inside project

* API key flow

* Api key

* Playground refactor

* Add chat hookup

* anthropic streaming support

* Bug fixes to openai playground

* Anthropic bugfixes

* Anthropic bugfix

* Cohere first iteration

* Cohere role fixes

* Cohere api fix

* Parallel running

* Playground cost calculation non streaming

* playground - streaming token calculation

* latency and cost

* Support for Groq

* Add model name

* Prompt management views

* Remove current promptset flow

* Prompt management API hooks

* Prompt registry final

* Playground bugfixes

* Bug fix playground

* Rearrange project nav

* Fix playground

* Fix prompts

* Bugfixes

* Minor fix

* Prompt versioning bugfix

* Bugfix

* fix: clickhouse table find queries (#82)

* Fix to surface multiple LLM requests inside LLM View (#84)

* Pagination bug

* Bug fix

* Fix for surfacing multiple LLM requests in LLMView

* Minor bugfixes (#86)

* Pagination bug

* Bug fix

* Bugfixes

* api to fetch promptset with prompt filters

* bug fixes

* fix invalid redirect

* fix invalid status code

* Project Switcher (#90)

* Pagination bug

* Bug fix

* Project switcher

* Feat: dataset download (#60)

* API: download dataset

* API: Download dataset

* updated download-dataset api

* Updated: download_dataset api

* Updated download dataset API

* Updated Download API: changed Response to NextResponse, added a condition to ensure max page size is 500

* updated the download-dataset API: fixed the format and removed redundant lines of code

* Updated download_dataset API: file name and removed 'id' param

* Added the Download dataset button.

* Merged development into my branch

* Updated button size

* Fixes

---------

Co-authored-by: Karthik Kalyanaraman <karthik@scale3labs.com>

* Update prompt registry with instructions to fetch prompts (#91)

* Pagination bug

* Bug fix

* Update prompt registry

* Minor bugfix (#94)

* Pagination bug

* Bug fix

* Minor bugfix

* chore: update github repo badges

* Add GPT4-O Pricing and Playground (#98)

* Pagination bug

* Bug fix

* Add GPT4-O support

* Add GPT4-O support

* Update cost

* Dylan/s3en 2234 add perplexity support to playground (#89)

* adding perplexity to playground types

* adding UI stuff

* adding perplexity chat api

* fixing perplexity model dropdown

---------

Co-authored-by: Karthik Kalyanaraman <karthik@scale3labs.com>

---------

Co-authored-by: Darshit Suratwala <darshit@scale3labs.com>
Co-authored-by: darshit-s3 <119623510+darshit-s3@users.noreply.github.com>
Co-authored-by: dylan <dylan@scale3labs.com>
Co-authored-by: dylanzuber-scale3 <116033320+dylanzuber-scale3@users.noreply.github.com>
Co-authored-by: Rohit Kadhe <rohit@scale3labs.com>
Co-authored-by: Rohit Kadhe <113367036+rohit-kadhe@users.noreply.github.com>
Co-authored-by: MayuriS24 <159064413+MayuriS24@users.noreply.github.com>
8 people committed May 13, 2024
1 parent d3bc1ba commit 89480e7
Showing 12 changed files with 609 additions and 6 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -28,6 +28,7 @@ yarn-error.log*

# local env files
.env.local
.env

# vercel
.vercel
8 changes: 5 additions & 3 deletions README.md
@@ -3,10 +3,12 @@
## Open Source & Open Telemetry(OTEL) Observability for LLM applications

![Static Badge](https://img.shields.io/badge/License-AGPL--3.0-blue)
![NPM Version](https://img.shields.io/npm/v/%40langtrase%2Ftypescript-sdk?style=flat&logo=npm&label=%40langtrase%2Ftypescript-sdk&color=green&link=https%3A%2F%2Fgithub.mirror.nvdadr.com%2FScale3-Labs%2Flangtrace-typescript-sdk)
![PyPI - Version](https://img.shields.io/pypi/v/langtrace-python-sdk?style=flat&logo=python&label=langtrace-python-sdk&color=green&link=https%3A%2F%2Fgithub.mirror.nvdadr.com%2FScale3-Labs%2Flangtrace-python-sdk)
[![NPM Typescript SDK](https://img.shields.io/npm/v/%40langtrase%2Ftypescript-sdk?style=flat&logo=npm&label=%40langtrase%2Ftypescript-sdk&color=green&link=https%3A%2F%2Fgithub.mirror.nvdadr.com%2FScale3-Labs%2Flangtrace-typescript-sdk)](https://github.com/Scale3-Labs/langtrace-typescript-sdk)
[![PyPI Python SDK](https://img.shields.io/pypi/v/langtrace-python-sdk?style=flat&logo=python&label=langtrace-python-sdk&color=green&link=https%3A%2F%2Fgithub.mirror.nvdadr.com%2FScale3-Labs%2Flangtrace-python-sdk)](https://github.com/Scale3-Labs/langtrace-python-sdk)
[![NPM Trace Attributes](https://img.shields.io/npm/v/%40langtrase%2Ftrace-attributes?style=flat&logo=npm&label=%40langtrase%2Ftrace-attributes&color=green&link=https%3A%2F%2Fgithub.mirror.nvdadr.com%2FScale3-Labs%2Flangtrace-trace-attributes)](https://github.com/Scale3-Labs/langtrace-trace-attributes)
[![PyPI Trace Attributes](https://img.shields.io/pypi/v/trace-attributes?style=flat&logo=python&label=trace-attributes&color=green&link=https%3A%2F%2Fgithub.mirror.nvdadr.com%2FScale3-Labs%2Flangtrace-trace-attributes)](https://github.com/Scale3-Labs/langtrace-trace-attributes)
![Static Badge](https://img.shields.io/badge/Development_status-Active-green)
[![Deploy on Railway](https://railway.app/button.svg)](https://railway.app/template/yZGbfC?referralCode=MA2S9H)
[![Deploy on Railway](https://railway.app/button.svg)](https://railway.app/template/8dNq1c?referralCode=MA2S9H)

---

12 changes: 12 additions & 0 deletions app/(protected)/project/[project_id]/playground/page.tsx
@@ -11,6 +11,7 @@ import {
  OpenAIChatInterface,
  OpenAIModel,
  OpenAISettings,
  PerplexitySettings,
} from "@/lib/types/playground_types";
import Link from "next/link";
import { useState } from "react";
@@ -69,6 +70,17 @@ export default function Page() {
        settings: settings,
      };
      setLLMs((currentLLMs) => [...currentLLMs, cohereChat]);
    } else if (vendor === "perplexity") {
      const settings: PerplexitySettings = {
        messages: [],
        model: "mistral-7b-instruct",
      };
      const perplexityChat: ChatInterface = {
        id: uuidv4(),
        vendor: "perplexity",
        settings: settings,
      };
      setLLMs((currentLLMs) => [...currentLLMs, perplexityChat]);
    }
  };

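Each vendor branch in the page component above builds the same `{ id, vendor, settings }` shape with vendor-specific defaults. A table-driven sketch of that pattern (names and default models here are illustrative, except `mistral-7b-instruct`, which appears in the diff; this is not code from the commit):

```typescript
// Hypothetical sketch: one factory instead of an if/else chain per vendor.
// Default model names other than perplexity's are placeholders.
type Vendor = "openai" | "anthropic" | "cohere" | "groq" | "perplexity";

interface ChatInterface {
  id: string;
  vendor: Vendor;
  settings: { messages: unknown[]; model: string };
}

const defaultSettings: Record<Vendor, { messages: unknown[]; model: string }> = {
  openai: { messages: [], model: "gpt-4" },           // placeholder
  anthropic: { messages: [], model: "claude-3-opus" }, // placeholder
  cohere: { messages: [], model: "command-r" },        // placeholder
  groq: { messages: [], model: "llama3-70b-8192" },    // placeholder
  perplexity: { messages: [], model: "mistral-7b-instruct" }, // from the diff
};

function makeChat(vendor: Vendor, id: string): ChatInterface {
  // Spread so each chat gets its own settings object, as each branch
  // in the commit constructs a fresh settings literal.
  return { id, vendor, settings: { ...defaultSettings[vendor] } };
}
```

A lookup table like this collapses the repeated branches into one code path; the commit keeps the explicit chain, which is equally valid while vendors are few.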
35 changes: 35 additions & 0 deletions app/api/chat/perplexity/route.ts
@@ -0,0 +1,35 @@
import { OpenAIStream, StreamingTextResponse } from "ai";
import { NextResponse } from "next/server";
import OpenAI from "openai";

export async function POST(req: Request) {
  try {
    const data = await req.json();
    const isStream = data.stream;
    const apiKey = data.apiKey;

    delete data.apiKey;

    const perplexity = new OpenAI({
      apiKey: apiKey || "",
      baseURL: "https://api.perplexity.ai/",
    });

    const response = await perplexity.chat.completions.create({
      ...data,
    });

    // Convert the response into a friendly text-stream
    if (isStream) {
      const stream = OpenAIStream(response as any);
      return new StreamingTextResponse(stream);
    }

    return NextResponse.json(response);
  } catch (error: any) {
    return NextResponse.json({
      error: error?.message || "Something went wrong",
      status: error?.status || 500,
    });
  }
}
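The route above pulls `apiKey` out of the request body (via `delete`) so the key is never forwarded to the completions call. The same split can be written without mutation using rest destructuring; a minimal sketch (helper name is hypothetical, not from the commit):

```typescript
// Sketch of the payload split done in the Perplexity route: separate the
// client-supplied API key from the fields forwarded to the completions API.
interface PlaygroundRequest {
  apiKey?: string;
  stream?: boolean;
  [key: string]: unknown;
}

function splitRequest(data: PlaygroundRequest): {
  apiKey: string;
  payload: Record<string, unknown>;
} {
  // Rest destructuring leaves `payload` without the apiKey property,
  // so the original request object is never mutated.
  const { apiKey, ...payload } = data;
  return { apiKey: apiKey ?? "", payload };
}
```

This keeps the forwarded payload free of credentials even if the request object is logged or reused later.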
47 changes: 47 additions & 0 deletions components/playground/chat-handlers.ts
@@ -3,6 +3,7 @@ import {
  CohereChatInterface,
  GroqChatInterface,
  OpenAIChatInterface,
  PerplexityChatInterface,
} from "@/lib/types/playground_types";

export async function openAIHandler(
@@ -299,3 +300,49 @@

  return response;
}

export async function perplexityHandler(
  llm: PerplexityChatInterface,
  apiKey: string
): Promise<any> {
  const body: any = {};
  if (llm.settings.messages.length > 0) {
    body.messages = llm.settings.messages.map((m) => {
      return { content: m.content, role: m.role };
    });
  }
  if (llm.settings.model) {
    body.model = llm.settings.model;
  }
  if (llm.settings.temperature) {
    body.temperature = llm.settings.temperature;
  }
  if (llm.settings.max_tokens) {
    body.max_tokens = llm.settings.max_tokens;
  }
  if (llm.settings.frequency_penalty) {
    body.frequency_penalty = llm.settings.frequency_penalty;
  }
  if (llm.settings.presence_penalty) {
    body.presence_penalty = llm.settings.presence_penalty;
  }
  if (llm.settings.stream !== undefined) {
    body.stream = llm.settings.stream;
  }
  if (llm.settings.top_p) {
    body.top_p = llm.settings.top_p;
  }

  // Get the API key from the browser store
  body.apiKey = apiKey;

  const response = await fetch("/api/chat/perplexity", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });

  return response;
}
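One subtlety in the handler above: the truthiness checks (`if (llm.settings.temperature)`) would also drop valid zero values such as `temperature: 0` or `presence_penalty: 0`. A sketch of the same body-building step with explicit `undefined` checks (helper and constant names are hypothetical, not from the commit):

```typescript
// Sketch: copy only the settings that are actually set, using an
// `!== undefined` test so falsy-but-valid values (e.g. temperature: 0,
// stream: false) are still forwarded.
const OPTIONAL_KEYS = [
  "model", "temperature", "max_tokens", "frequency_penalty",
  "presence_penalty", "stream", "top_p",
] as const;

function buildBody(settings: Record<string, unknown>): Record<string, unknown> {
  const body: Record<string, unknown> = {};
  if (Array.isArray(settings.messages) && settings.messages.length > 0) {
    body.messages = settings.messages;
  }
  for (const key of OPTIONAL_KEYS) {
    if (settings[key] !== undefined) body[key] = settings[key];
  }
  return body;
}
```

The loop also removes the per-field repetition, so adding a new optional setting means extending `OPTIONAL_KEYS` rather than adding another `if` block.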
43 changes: 43 additions & 0 deletions components/playground/llmchat.tsx
@@ -13,6 +13,8 @@ import {
  OpenAIChatInterface,
  OpenAIRole,
  OpenAISettings,
  PerplexityChatInterface,
  PerplexitySettings,
} from "@/lib/types/playground_types";
import { calculatePriceFromUsage, calculateTokens } from "@/lib/utils";
import { ArrowTopRightIcon } from "@radix-ui/react-icons";
@@ -28,13 +30,15 @@
  cohereHandler,
  groqHandler,
  openAIHandler,
  perplexityHandler,
} from "./chat-handlers";
import { Message } from "./common";
import {
  AnthropicSettingsSheet,
  CohereSettingsSheet,
  GroqSettingsSheet,
  OpenAISettingsSheet,
  PerplexitySettingsSheet,
} from "./settings-sheet";

function identity<T>(value: T): T {
@@ -191,6 +195,15 @@ export default function LLMChat({
            }}
          />
        )}
        {localLLM.vendor === "perplexity" && (
          <PerplexitySettingsSheet
            settings={localLLM.settings as PerplexitySettings}
            setSettings={(updatedSettings: any) => {
              setLocalLLM({ ...localLLM, settings: updatedSettings });
              setLLM({ ...llm, settings: updatedSettings });
            }}
          />
        )}
      </div>
      <Button
        type="button"
@@ -232,6 +245,11 @@
          localLLM as GroqChatInterface,
          apiKey || ""
        );
      } else if (localLLM.vendor === "perplexity") {
        response = await perplexityHandler(
          localLLM as PerplexityChatInterface,
          apiKey || ""
        );
      }

      if (localLLM.settings.stream === true) {
@@ -495,6 +513,31 @@
              : "";
          setCost(totalCost);
        }
      } else if (localLLM.vendor === "perplexity") {
        if (data?.choices?.length > 0) {
          if (data.choices[0]?.message?.content) {
            message = data.choices[0].message.content;
          }
        }
        if (data?.usage) {
          const inputTokens = data.usage.prompt_tokens;
          const outputTokens = data.usage.completion_tokens;
          const vendor = localLLM.vendor;
          const model = localLLM.settings.model;
          const calculatedCost = calculatePriceFromUsage(
            vendor,
            model,
            {
              input_tokens: inputTokens,
              output_tokens: outputTokens,
            }
          );
          const totalCost =
            calculatedCost.total.toFixed(6) !== "0.000000"
              ? `\$${calculatedCost.total.toFixed(6)}`
              : "";
          setCost(totalCost);
        }
      }
      setLLM({
        ...llm,
3 changes: 3 additions & 0 deletions components/playground/model-dropdown.tsx
@@ -19,6 +19,7 @@ import {
  cohereModels,
  groqModels,
  openAIModels,
  perplexityModels,
} from "@/lib/types/playground_types";
import { cn } from "@/lib/utils";

@@ -39,6 +40,8 @@ export function ModelsDropDown({
    models = cohereModels;
  } else if (vendor === "groq") {
    models = groqModels;
  } else if (vendor === "perplexity") {
    models = perplexityModels;
  }
  return (
    <Popover open={open} onOpenChange={setOpen}>
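The dropdown selects a model list per vendor with a growing if/else chain. The equivalent lookup-map form, as a sketch (model arrays here are placeholders; the real lists are the `*Models` exports from `@/lib/types/playground_types`):

```typescript
// Hypothetical sketch: vendor -> model-list chain as a single map lookup.
const modelsByVendor: Record<string, string[]> = {
  openai: ["gpt-4"],                      // placeholder
  anthropic: ["claude-3-opus"],           // placeholder
  cohere: ["command-r"],                  // placeholder
  groq: ["llama3-70b-8192"],              // placeholder
  perplexity: ["mistral-7b-instruct"],    // default model from this commit
};

function modelsFor(vendor: string): string[] {
  // Unknown vendors fall back to an empty list rather than undefined,
  // so callers can always iterate.
  return modelsByVendor[vendor] ?? [];
}
```

With a map, adding a vendor (as this commit does for Perplexity) becomes a one-line entry instead of another `else if` branch.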
