Flair-light for production #2451

Closed
helpmefindaname opened this issue Sep 27, 2021 · 3 comments
Labels
wontfix This will not be worked on

Comments

@helpmefindaname
Collaborator

(I don't expect that this will be implemented, but I would like to see whether others face similar problems and whether there is general interest in doing something like this.)

Is your feature/enhancement request related to a problem? Please describe.
I am currently trying to deploy a new model to production, with the constraint that my Docker image may be at most 1 GB.
My issue is that flair + FastAPI + uvicorn already take 1.2 GB for the requirements alone (using pytorch-cpu only).

Looking at requirements.txt, there are a lot of requirements that I wouldn't need for running prediction, and plenty that are only needed optionally (conllu, huggingface_hub, wikipedia-api), depending on the embedding types: gensim is used only for word/flair embeddings, langdetect only for specific multilingual embeddings, the huggingface libraries only for TransformerEmbeddings, sklearn only for TARS or for evaluation, janome only for Japanese tokenizers, ...

I am pretty sure I could make the image small enough by excluding the model and dropping the requirements I don't need; however, as currently implemented, flair tries to import all (or most) of those libraries on startup and fails if they are not installed.

Describe the solution you'd like
A light version of flair that only implements the prediction functions (no datasets, no training, no evaluation), only imports external libraries upon use (e.g. `import transformers` in the constructor of TransformerWordEmbeddings), and defines specific extras, so that only what is actually required has to be installed.
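As a rough illustration of the extras idea, a setup.py could group the optional dependencies into extras. The extras names and groupings below are hypothetical, not an existing Flair packaging layout; they only reuse the packages mentioned above:

```python
# Hypothetical setup.py excerpt: package name, extras names, and groupings are illustrative only.
from setuptools import setup, find_packages

setup(
    name="flair-light",                      # hypothetical package name
    packages=find_packages(),
    install_requires=["torch", "tqdm"],      # minimal core needed for prediction
    extras_require={
        "transformers": ["transformers", "huggingface_hub"],  # TransformerEmbeddings
        "word-embeddings": ["gensim"],                        # classic word/flair embeddings
        "multilingual": ["langdetect"],                       # language-detecting embeddings
        "japanese": ["janome"],                               # Japanese tokenizers
        "datasets": ["conllu", "wikipedia-api"],              # corpus loading
    },
)
```

Users would then install e.g. `pip install flair-light[transformers]` and only pull in the transformer stack on top of the core.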

An alternative solution (which I would like less) would be to integrate this directly into this repository by only importing external libraries upon use (sketched below), so that people could install flair without dependencies and install the required ones manually.
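Deferring the heavy imports into the constructors could look roughly like this; the class below is a simplified stand-in, not the actual Flair implementation:

```python
# Simplified sketch of an on-demand import; not the real Flair class.
class TransformerWordEmbeddings:
    def __init__(self, model: str = "bert-base-uncased"):
        try:
            # transformers is only imported when this embedding type is actually used
            from transformers import AutoModel, AutoTokenizer
        except ImportError as e:
            raise ImportError(
                "TransformerWordEmbeddings requires the 'transformers' package; "
                "install it with `pip install transformers`."
            ) from e
        self.tokenizer = AutoTokenizer.from_pretrained(model)
        self.model = AutoModel.from_pretrained(model)
```

With that pattern, `pip install flair --no-deps` plus a hand-picked set of libraries would be enough for a prediction-only image.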

@alanakbik
Collaborator

Good idea, and something to add to the list ;) We are slowly working towards a "Flair 1.0" release, and once we get there that might be a good feature to include!

@ydennisy

Are Flair models exportable as plain PyTorch models? I think it would be nice to get the raw model into ONNX format, for example, and then serve that.
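For the transformer backbone alone, the export could look roughly like this (a sketch assuming direct access to the underlying Hugging Face model, not a Flair-provided API; the model name is just an example):

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-uncased"  # example model, not a Flair checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()
model.config.return_dict = False  # return a plain tuple so tracing/export is straightforward

# Dummy input, only used to trace the graph for export
inputs = tokenizer("Flair in production", return_tensors="pt")

torch.onnx.export(
    model,
    (inputs["input_ids"], inputs["attention_mask"]),
    "model.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["last_hidden_state", "pooler_output"],
    dynamic_axes={
        "input_ids": {0: "batch", 1: "sequence"},
        "attention_mask": {0: "batch", 1: "sequence"},
    },
    opset_version=14,
)
```

The exported graph could then be served with an ONNX runtime, but any Flair-specific pre/post-processing (tokenization, tag decoding) would still have to be reimplemented around it.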

@stale

stale bot commented Feb 26, 2022

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the wontfix (This will not be worked on) label on Feb 26, 2022
stale bot closed this as completed on Mar 6, 2022