Deploy TableLLM-13b-bpw3
Explore our Inference Catalog to deploy popular models on optimized configurations.
Commit Revision (optional)
Specify a revision commit hash for the Hugging Face repository.
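If you want to pin a specific commit, one way to find its hash is to list the repository's commits with the huggingface_hub client. A minimal sketch, assuming huggingface_hub is installed; the repo id shown is a placeholder for the model's full owner/name path:

```python
from huggingface_hub import HfApi

api = HfApi()
# "<namespace>/TableLLM-13b-bpw3" is a placeholder: use the model's full repo id.
for commit in api.list_repo_commits("<namespace>/TableLLM-13b-bpw3"):
    print(commit.commit_id, commit.title)
```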
Contact us if you'd like to request a custom solution or instance type.
Cloud region: N. Virginia (us-east-1)
You may want to select a GPU-accelerated instance to use the optimized Text Generation container.
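The same choices can also be made programmatically when creating the endpoint. A sketch, assuming huggingface_hub's create_inference_endpoint; the endpoint name, repo id, commit hash, and the nvidia-a10g/x1 instance are illustrative placeholders, and actual instance availability varies by region:

```python
from huggingface_hub import create_inference_endpoint

endpoint = create_inference_endpoint(
    "tablellm-13b-bpw3",                          # endpoint name (placeholder)
    repository="<namespace>/TableLLM-13b-bpw3",   # placeholder repo id
    framework="pytorch",
    task="text-generation",
    accelerator="gpu",                            # GPU-accelerated instance for the optimized container
    vendor="aws",
    region="us-east-1",                           # N. Virginia
    instance_type="nvidia-a10g",                  # illustrative; availability varies
    instance_size="x1",
    revision="<commit-hash>",                     # optional: pin the commit hash found above
    type="protected",
)
endpoint.wait()    # block until the endpoint reports "running"
print(endpoint.url)
```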
Number of replicas
Automatically scale the number of replicas between the Min and Max values based on compute usage. Min is always 0 when Scale-To-Zero is active.
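The replica range can also be adjusted after deployment. A sketch, assuming huggingface_hub's update_inference_endpoint and the endpoint name used above:

```python
from huggingface_hub import update_inference_endpoint

# Widen or narrow the replica range on the existing endpoint; Min stays 0
# while Scale-To-Zero is active.
update_inference_endpoint("tablellm-13b-bpw3", min_replica=0, max_replica=2)
```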
Autoscaling Strategy
Control what type of trigger will cause your Endpoint to scale up.
A protected Endpoint is available from the Internet, secured with TLS/SSL, and requires a valid Hugging Face token for authentication.
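In practice, a call to a protected Endpoint must carry the token in an Authorization header; the endpoint URL below is a placeholder for your endpoint's address:

```python
import os
import requests

# A protected Endpoint rejects unauthenticated calls; pass a valid Hugging Face
# token as a Bearer token in the Authorization header.
response = requests.post(
    "https://<endpoint-id>.endpoints.huggingface.cloud",
    headers={
        "Authorization": f"Bearer {os.environ['HF_TOKEN']}",
        "Content-Type": "application/json",
    },
    json={"inputs": "Summarize the following table: ..."},
)
print(response.json())
```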
Default Env
Environment variables that will be provided to your container during deployment.
Secret Env
Same as Default Env, but users with access to this endpoint will not be able to read these values after creation.
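Inside the running container, both Default Env and Secret Env entries surface as ordinary environment variables. A small sketch with hypothetical key names used only for illustration:

```python
import os

# Default Env and Secret Env entries are plain environment variables inside the
# container. TABLE_OUTPUT_FORMAT and API_SECRET are hypothetical keys.
output_format = os.environ.get("TABLE_OUTPUT_FORMAT", "markdown")
api_secret = os.environ.get("API_SECRET")  # hidden in the UI after creation, readable in the container
```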