Articles

Showing posts from July, 2024

Llama3 Typhoon v1.5 (scb10x) LLM

Typhoon-1.5 models come in 8B and 72B sizes. They are built on the Llama3 8B and Qwen 72B base models; the 8B weights are released under the Meta Llama 3 Community License, and the 72B weights under the Tongyi Qianwen License.

Performance: To gain insight into Typhoon's performance, we evaluated it using multiple-choice exams.

Language & Knowledge Capabilities: We assessed Typhoon on multiple-choice question answering datasets, including ThaiExam, M3Exam, and MMLU. The ThaiExam dataset was sourced from standard examinations in Thailand, including ONET, TGAT, TPAT, and A-Level. M3Exam is a benchmark covering Southeast Asian countries, including Thailand. MMLU is a standard English-language benchmark for language models.

Typhoon-1.5X is an eXperimental model designed for application use cases, featuring improved capabilities in Retrieval-Augmented Generation (RAG), constrained generation, and reasoning, in order to achieve competitive performance in …
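The evaluation described above boils down to comparing each model-predicted choice letter against the exam's answer key. Here is a minimal sketch of that scoring step; the item format and the example letters are illustrative assumptions, not the actual Typhoon evaluation harness.

```python
# Minimal sketch of multiple-choice accuracy scoring
# (ThaiExam / M3Exam / MMLU style: one choice letter per item).
def accuracy(predictions, answer_key):
    """Fraction of items where the predicted choice matches the key."""
    correct = sum(p == a for p, a in zip(predictions, answer_key))
    return correct / len(answer_key)

# Illustrative data only, not real Typhoon results.
preds = ["A", "C", "B", "D"]
key = ["A", "B", "B", "D"]
print(accuracy(preds, key))  # 0.75
```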

LLAMA-CPP-PYTHON on RTX4060 GPU

LLAMA-CPP-PYTHON on NVIDIA RTX4060 GPU

We tried the llama-cpp-python library on several operating systems: Windows 11, Ubuntu 22.04, and Colab (also Ubuntu 22.04).

llama-cpp-python with an RTX4060 GPU on Windows 11:

1. Install the NVIDIA GPU driver. We used driver version 537.24; then run the nvidia-smi command to check your GPU.
2. Install the CUDA Toolkit from https://developer.nvidia.com/cuda-downloads. We used CUDA 12.1; make sure CUDA works by running the nvcc command.
3. We used Python 3.11.7; check python and pip from the command line.
4. Install PyTorch from https://pytorch.org/get-started/locally/. Check pip list, and check that CUDA is available from a Python shell. If it returns False, PyTorch cannot use the NVIDIA GPU, so find and fix the problem before continuing.
5. Install llama-cpp-python (https://pypi.org/project/llama-cpp-python/) built for CUDA 12.1 with the command below. As of Jul 2, 2024 the latest version is 0.2.81, but we used 0.2.75, which works fine (0.2.81 has not been tested on Windows 11 yet).

pip install llama-cpp-python \ --extra-index-url https://abetlen.github.io/ll…
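The CUDA availability check in step 4 can be sketched as a small script. This is a minimal sketch of the verification step, not part of the original post; it degrades gracefully when PyTorch is not installed yet.

```python
# Hedged sketch: verify the CUDA stack before installing the CUDA build
# of llama-cpp-python. Versions in the post (driver 537.24, CUDA 12.1,
# Python 3.11.7) are the authors' setup, not hard requirements.
import importlib.util

def cuda_ready():
    """Return True only if PyTorch is installed and reports a CUDA device."""
    if importlib.util.find_spec("torch") is None:
        return False  # PyTorch not installed yet (step 4 incomplete)
    import torch
    return torch.cuda.is_available()

if __name__ == "__main__":
    if cuda_ready():
        print("CUDA available: OK to install the CUDA wheel of llama-cpp-python")
    else:
        print("CUDA not available: fix driver / CUDA Toolkit / PyTorch first")
```

If this prints the "not available" line, llama-cpp-python will still install but will fall back to CPU-only inference, which is why the post stops to troubleshoot here.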