From nobody Thu Feb 15 11:25:10 2024
Date: Thu, 15 Feb 2024 11:25:10 GMT
Message-Id: <202402151125.41FBPA0h046156@gitrepo.freebsd.org>
To: ports-committers@FreeBSD.org, dev-commits-ports-all@FreeBSD.org,
    dev-commits-ports-main@FreeBSD.org
From: Yuri Victorovich <yuri@FreeBSD.org>
Subject: git: d549d297aaf7 - main - misc/llama-cpp: New port: Facebook's LLaMA model in C/C++
List-Id: Commit messages for all branches of the ports repository
List-Archive: https://lists.freebsd.org/archives/dev-commits-ports-all
Sender: owner-dev-commits-ports-all@freebsd.org
X-BeenThere:
 dev-commits-ports-all@freebsd.org
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 8bit
X-Git-Committer: yuri
X-Git-Repository: ports
X-Git-Refname: refs/heads/main
X-Git-Reftype: branch
X-Git-Commit: d549d297aaf7d12246f16893f0521079da99de9e
Auto-Submitted: auto-generated

The branch main has been updated by yuri:

URL: https://cgit.FreeBSD.org/ports/commit/?id=d549d297aaf7d12246f16893f0521079da99de9e

commit d549d297aaf7d12246f16893f0521079da99de9e
Author:     Yuri Victorovich <yuri@FreeBSD.org>
AuthorDate: 2024-02-15 11:24:32 +0000
Commit:     Yuri Victorovich <yuri@FreeBSD.org>
CommitDate: 2024-02-15 11:25:01 +0000

    misc/llama-cpp: New port: Facebook's LLaMA model in C/C++
---
 misc/Makefile            |  1 +
 misc/llama-cpp/Makefile  | 36 ++++++++++++++++++++++++++++++++++++
 misc/llama-cpp/distinfo  |  5 +++++
 misc/llama-cpp/pkg-descr |  3 +++
 misc/llama-cpp/pkg-plist | 38 ++++++++++++++++++++++++++++++++++++++
 5 files changed, 83 insertions(+)

diff --git a/misc/Makefile b/misc/Makefile
index c9a42d646543..345fe32c1080 100644
--- a/misc/Makefile
+++ b/misc/Makefile
@@ -245,6 +245,7 @@
     SUBDIR += lifelines
     SUBDIR += lightgbm
     SUBDIR += lingoteach
+    SUBDIR += llama-cpp
     SUBDIR += locale-en_DK
     SUBDIR += localedata
     SUBDIR += logsurfer
diff --git a/misc/llama-cpp/Makefile b/misc/llama-cpp/Makefile
new file mode 100644
index 000000000000..3cabec77cc9a
--- /dev/null
+++ b/misc/llama-cpp/Makefile
@@ -0,0 +1,36 @@
+PORTNAME=	llama-cpp
+DISTVERSIONPREFIX=	b
+DISTVERSION=	2144
+CATEGORIES=	misc # machine-learning
+
+MAINTAINER=	yuri@FreeBSD.org
+COMMENT=	Facebook's LLaMA model in C/C++
+WWW=		https://github.com/ggerganov/llama.cpp
+
+LICENSE=	MIT
+LICENSE_FILE=	${WRKSRC}/LICENSE
+
+USES=		cmake:testing python:run shebangfix
+USE_LDCONFIG=	yes
+
+USE_GITHUB=	yes
+GH_ACCOUNT=	ggerganov
+GH_PROJECT=	llama.cpp
+GH_TUPLE=	nomic-ai:kompute:4565194:kompute/kompute
+
+SHEBANG_GLOB=	*.py
+
+CMAKE_ON=	BUILD_SHARED_LIBS
+CMAKE_OFF=	LLAMA_BUILD_TESTS
+CMAKE_TESTING_ON=	LLAMA_BUILD_TESTS
+
+LDFLAGS+=	-pthread
+
+OPTIONS_DEFINE=	EXAMPLES
+OPTIONS_SUB=	yes
+
+EXAMPLES_CMAKE_BOOL=	LLAMA_BUILD_EXAMPLES
+
+BINARY_ALIAS=	git=false
+
+.include <bsd.port.mk>
diff --git a/misc/llama-cpp/distinfo b/misc/llama-cpp/distinfo
new file mode 100644
index 000000000000..b11d5d4b173e
--- /dev/null
+++ b/misc/llama-cpp/distinfo
@@ -0,0 +1,5 @@
+TIMESTAMP = 1707995361
+SHA256 (ggerganov-llama.cpp-b2144_GH0.tar.gz) = 679d2deb41b9df3d04bc5fb3b8fd255717009a08927962eca6476f26bff74731
+SIZE (ggerganov-llama.cpp-b2144_GH0.tar.gz) = 8562099
+SHA256 (nomic-ai-kompute-4565194_GH0.tar.gz) = 95b52d2f0514c5201c7838348a9c3c9e60902ea3c6c9aa862193a212150b2bfc
+SIZE (nomic-ai-kompute-4565194_GH0.tar.gz) = 13540496
diff --git a/misc/llama-cpp/pkg-descr b/misc/llama-cpp/pkg-descr
new file mode 100644
index 000000000000..176d4fa350e8
--- /dev/null
+++ b/misc/llama-cpp/pkg-descr
@@ -0,0 +1,3 @@
+The main goal of llama.cpp is to enable LLM inference with minimal setup and
+state-of-the-art performance on a wide variety of hardware - locally and in
+the cloud.
diff --git a/misc/llama-cpp/pkg-plist b/misc/llama-cpp/pkg-plist
new file mode 100644
index 000000000000..ced1b9e1d19c
--- /dev/null
+++ b/misc/llama-cpp/pkg-plist
@@ -0,0 +1,38 @@
+%%EXAMPLES%%bin/baby-llama
+%%EXAMPLES%%bin/batched
+%%EXAMPLES%%bin/batched-bench
+%%EXAMPLES%%bin/beam-search
+%%EXAMPLES%%bin/benchmark
+%%EXAMPLES%%bin/convert-llama2c-to-ggml
+%%EXAMPLES%%bin/convert-lora-to-ggml.py
+%%EXAMPLES%%bin/convert.py
+%%EXAMPLES%%bin/embedding
+%%EXAMPLES%%bin/export-lora
+%%EXAMPLES%%bin/finetune
+%%EXAMPLES%%bin/imatrix
+%%EXAMPLES%%bin/infill
+%%EXAMPLES%%bin/llama-bench
+%%EXAMPLES%%bin/llava-cli
+%%EXAMPLES%%bin/lookahead
+%%EXAMPLES%%bin/lookup
+%%EXAMPLES%%bin/main
+%%EXAMPLES%%bin/parallel
+%%EXAMPLES%%bin/passkey
+%%EXAMPLES%%bin/perplexity
+%%EXAMPLES%%bin/quantize
+%%EXAMPLES%%bin/quantize-stats
+%%EXAMPLES%%bin/save-load-state
+%%EXAMPLES%%bin/server
+%%EXAMPLES%%bin/simple
+%%EXAMPLES%%bin/speculative
+%%EXAMPLES%%bin/tokenize
+%%EXAMPLES%%bin/train-text-from-scratch
+include/ggml-alloc.h
+include/ggml-backend.h
+include/ggml.h
+include/llama.h
+lib/cmake/Llama/LlamaConfig.cmake
+lib/cmake/Llama/LlamaConfigVersion.cmake
+lib/libggml_shared.so
+lib/libllama.so
+%%EXAMPLES%%lib/libllava_shared.so