by launching “axllm.py my.yaml”, which builds its argument parser from the available network YAML collections:
# Assumed imports (axelera.app); adjust if the module paths differ in your install.
from axelera.app import config, yaml_parser

try:
    network_yaml_info = yaml_parser.get_network_yaml_info(
        include_collections=['llm_local', 'llm_cards', 'llm_zoo']
    )
    parser = config.create_llm_argparser(
        network_yaml_info, description='Perform LLM inference on an Axelera platform'
    )
except Exception as exc:  # placeholder error handling; the original script's handler is not shown
    raise SystemExit(f'Failed to collect LLM network YAMLs: {exc}')
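
For context, a minimal sketch of how the resulting parser might then be driven from the script's entry point; the exact arguments it exposes are an assumption here, and only the standard argparse calls are shown:

args = parser.parse_args()  # e.g. the "my.yaml" / network selection given on the command line
print(vars(args))           # inspect which network and options were resolved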
How do I configure the YAML file, and what goes inside it?
Can I specify the collections as flat key/value entries, like:

llm_local = ip:port
llm_cards = /dev/metis0
llm_zoo = https://huggingface.com/my/llm.zip
…
Or as a nested list of URLs:

llm_zoo:
  - url: ${HUG_RELEASE_URL}/llm/myhillarius.zip
  - url: ${HUG_RELEASE_URL}/llm/codellama:8B.zip
  - url: ${HUG_RELEASE_URL}/llm/r1-1776:70B.zip
That is, by duct-taping models onto my YAML?
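
To make the second form concrete, here is a minimal sketch of how such a collection could be read with plain PyYAML plus environment-variable expansion for the ${HUG_RELEASE_URL} placeholder; it only illustrates the proposed format and is not necessarily how axllm.py resolves it:

import os
import yaml  # PyYAML

# Hypothetical collection file, using the list-of-URLs form proposed above.
collection_text = """
llm_zoo:
  - url: ${HUG_RELEASE_URL}/llm/codellama:8B.zip
  - url: ${HUG_RELEASE_URL}/llm/r1-1776:70B.zip
"""

# Placeholder value purely for this sketch; normally ${HUG_RELEASE_URL} would come from the environment.
os.environ.setdefault('HUG_RELEASE_URL', 'https://example.com/releases')

# Expand ${HUG_RELEASE_URL} before handing the text to the YAML parser.
entries = yaml.safe_load(os.path.expandvars(collection_text))['llm_zoo']
for entry in entries:
    print(entry['url'])  # e.g. https://example.com/releases/llm/codellama:8B.zip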

