WithinUsAI/IBM-GPT5.4-Coder-1B · Hugging Face

Tags: Text Generation · Safetensors · English · granitemoehybrid · granite · ibm · full-finetune · dual-gpu · code · reasoning · conversational

IBM-GPT-5.4-Coder-1B

This model is a fully fine-tuned derivative of ibm-granite/granite-4.0-1b.
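As a causal language model on the Hub, it can presumably be loaded with the standard `transformers` API. A minimal inference sketch follows; the card does not document a chat template, so the plain-text prompt style below is an assumption.

```python
# Minimal inference sketch (assumes `transformers` and `torch` are installed).
# The plain prompt format is an assumption; the card documents no chat template.
MODEL_ID = "WithinUsAI/IBM-GPT5.4-Coder-1B"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Imports are kept inside the helper so the sketch is cheap to import.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # BF16 matches the tensor type listed on this card.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Write a Python function that checks whether a string is a palindrome."))
```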

Training setup:

  • Full model fine-tuning (all parameters updated)
  • No adapters, LoRA, or QLoRA
  • Dual-GPU training with DistributedDataParallel (DDP)
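A dual-GPU DDP run like the one above is typically launched with `torchrun`, which spawns one worker process per GPU. The script name and flags below are placeholders for illustration, not the authors' actual training command.

```shell
# Hypothetical launch command; train_full_finetune.py and its flags are placeholders.
# --nproc_per_node=2 starts one DDP worker per GPU on a single machine.
torchrun --nproc_per_node=2 train_full_finetune.py \
  --model_name_or_path ibm-granite/granite-4.0-1b \
  --bf16 \
  --output_dir ./IBM-GPT5.4-Coder-1B
```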
Downloads last month: 180
Format: Safetensors
Model size: 2B params
Tensor type: BF16

Model tree for WithinUsAI/IBM-GPT5.4-Coder-1B

Finetuned (6): this model
Quantizations: 2 models

Datasets used to train WithinUsAI/IBM-GPT5.4-Coder-1B