
Enable CUDA options in llama.cpp and bump its version #24683

Open

wants to merge 4 commits into base: master

Conversation


@RobinQu commented Jul 22, 2024

Summary

Changes to recipe: llama.cpp

Motivation

  1. Add a cuda option to the Conan recipe so the package can be compiled with CUDA enabled.
  2. Add version b3438.

Details

  • In conandata.yml, I added the source package URL and its sha256 checksum for b3438.
  • In conanfile.py, I added a cuda option that defaults to False. When enabled, LLAMA_CUDA is set to trigger compilation with CUDA via llama.cpp's original CMakeLists.txt.
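
A minimal sketch of the conanfile.py change described above (hypothetical and simplified, not the exact diff in this PR; the class body and toolchain wiring are illustrative):

```python
from conan import ConanFile
from conan.tools.cmake import CMakeToolchain


class LlamaCppConan(ConanFile):
    name = "llama-cpp"
    settings = "os", "arch", "compiler", "build_type"
    # New "cuda" option, disabled by default as described above.
    options = {"shared": [True, False], "cuda": [True, False]}
    default_options = {"shared": False, "cuda": False}

    def generate(self):
        tc = CMakeToolchain(self)
        # When cuda=True, pass LLAMA_CUDA so llama.cpp's own
        # CMakeLists.txt compiles the CUDA backend.
        tc.variables["LLAMA_CUDA"] = bool(self.options.cuda)
        tc.generate()
```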

@conan-center-bot

This comment has been minimized.

@RobinQu (Author) commented Jul 22, 2024

I am actually trying to get some help from former contributors to the llama.cpp recipe, as I found some issues compiling test_package with cuda=True.

The ggml library itself builds fine, but the compiler complains about missing CUDA symbols when linking the test_package binary, with both the newer version and the former version b3040.

Here are the complete logs: llama-cpp.log

@MartinDelille @czoido

Did you run into any problems after LLAMA_CUDA is enabled?

@jcar87 (Contributor) commented Jul 22, 2024

Hi @RobinQu , thanks for your contribution.

I suspect the issues you are experiencing are due to the following in the package_info() method:

self.cpp_info.components["llama"].libs = ["llama"]

it should probably be ["llama", "ggml"] (or the other way around).

Edit: on second thought, ggml may need to be its own component.
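
A hedged sketch of both layouts in package_info(), assuming the package ships both libllama and libggml (Option B reflects the "own component" idea; the actual fix may differ):

```python
def package_info(self):
    # Option A: add ggml to the existing component's libs
    # self.cpp_info.components["llama"].libs = ["llama", "ggml"]

    # Option B: model ggml as its own component that llama requires,
    # so consumers link both libraries in the right order.
    self.cpp_info.components["ggml"].libs = ["ggml"]
    self.cpp_info.components["llama"].libs = ["llama"]
    self.cpp_info.components["llama"].requires = ["ggml"]
```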

@czoido (Contributor) commented Jul 22, 2024

Hi @RobinQu,
Thanks for reporting. Here are some changes, similar to the ones proposed by @jcar87, that should fix the problem. I was going to open a PR to add a newer version, but I wanted to verify that it works fine with CUDA first. You can check whether those changes work for you: https://github.com/conan-io/conan-center-index/compare/master...czoido:conan-center-index:new-llama-cpp?expand=1

@conan-center-bot

This comment has been minimized.

@RobinQu (Author) commented Jul 23, 2024

> Hi @RobinQu , thanks for your contribution.
>
> I suspect the issues you are experiencing are due to the following in the package_info() method:
>
> self.cpp_info.components["llama"].libs = ["llama"]
>
> it should probably be ["llama", "ggml"] (or the other way around).
>
> Edit: on second thought, ggml may need to be its own component.

I tried updating the llama component's libs to ["llama", "ggml"] and similar errors occurred.

@RobinQu (Author) commented Jul 23, 2024

> Hi @RobinQu, Thanks for reporting. Here are some changes, similar to the ones proposed by @jcar87, that should fix the problem. I was going to open a PR to add a newer version, but I wanted to verify that it works fine with CUDA first. You can check whether those changes work for you: https://github.com/conan-io/conan-center-index/compare/master...czoido:conan-center-index:new-llama-cpp?expand=1

I tried the code on your branch and llama.cpp won't build, but that appears to be a problem introduced in b3347 itself, not in the recipe.

BTW, build log is attached below.
llama-cpp-build.txt

I think the test_package would fail to build anyway, as no other salient changes have been made to the recipe.

@RobinQu (Author) commented Jul 23, 2024

I suspect there may be some environment issues on my server.

However, I am testing the recipe in a Docker container based on nvidia/cuda:12.4.1-cudnn-devel-ubuntu22.04, the official image provided by NVIDIA, and compiling and linking simple CUDA applications works fine in that image.

@valgur (Contributor) commented Jul 23, 2024

For shared=False to work, you will need to either list cudart (and possibly other used CUDA libs) under cpp_info.system_libs, or export a .cmake module that finds CUDA::cudart and adds it as a link_libraries target for the appropriate llama.cpp CMake target. You can find an example of the latter in the stdgpu recipe and a few others.
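
A minimal sketch of the first (system_libs) approach, assuming the cuda option from this PR; the exact list of CUDA runtime libraries is an assumption and should be checked against the actual link errors:

```python
def package_info(self):
    self.cpp_info.components["llama"].libs = ["llama", "ggml"]
    if self.options.cuda and not self.options.shared:
        # Static builds leave CUDA runtime symbols unresolved, so they
        # must be declared for consumers; CMakeDeps then adds them to
        # the generated targets. This assumes the libs are on the
        # linker's default search path; otherwise the .cmake-module
        # approach with CUDA::cudart is the more robust option.
        self.cpp_info.components["llama"].system_libs.extend(["cudart", "cublas"])
```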

@czoido (Contributor) commented Jul 23, 2024

> > Hi @RobinQu, Thanks for reporting. Here are some changes, similar to the ones proposed by @jcar87, that should fix the problem. I was going to open a PR to add a newer version, but I wanted to verify that it works fine with CUDA first. You can check whether those changes work for you: https://github.com/conan-io/conan-center-index/compare/master...czoido:conan-center-index:new-llama-cpp?expand=1
>
> I tried the code on your branch and llama.cpp won't build, but that appears to be a problem introduced in b3347 itself, not in the recipe.
>
> BTW, build log is attached below. llama-cpp-build.txt
>
> I think the test_package would fail to build anyway, as no other salient changes have been made to the recipe.

Hi @RobinQu,
Please check this PR and try the changes there; some more changes are still needed for it to work with CUDA: #24694

@conan-center-bot (Collaborator)

Conan v1 pipeline ❌

Failure in build 3 (d6aca872d5c28e4a2573c245bd9dc7c2d2af035e):

  • llama-cpp/b2038:
    Error running command conan export recipes/llama-cpp/all/conanfile.py llama-cpp/b2038@:

    [HOOK - conan-center.py] pre_export(): [DEPRECATED GLOBAL CPPSTD (KB-H001)] OK
    [HOOK - conan-center.py] pre_export(): [REFERENCE LOWERCASE (KB-H002)] OK
    [HOOK - conan-center.py] pre_export(): [RECIPE METADATA (KB-H003)] OK
    [HOOK - conan-center.py] pre_export(): [HEADER_ONLY, NO COPY SOURCE (KB-H005)] OK
    [HOOK - conan-center.py] pre_export(): [FPIC OPTION (KB-H006)] OK
    [HOOK - conan-center.py] pre_export(): [VERSION RANGES (KB-H008)] OK
    [HOOK - conan-center.py] pre_export(): [RECIPE FOLDER SIZE (KB-H009)] Total recipe size: 6.2216796875 KB
    [HOOK - conan-center.py] pre_export(): [RECIPE FOLDER SIZE (KB-H009)] OK
    [HOOK - conan-center.py] pre_export(): [EXPORT LICENSE (KB-H023)] exports: None
    [HOOK - conan-center.py] pre_export(): [EXPORT LICENSE (KB-H023)] exports: None
    [HOOK - conan-center.py] pre_export(): [EXPORT LICENSE (KB-H023)] OK
    [HOOK - conan-center.py] pre_export(): [TEST PACKAGE FOLDER (KB-H024)] OK
    [HOOK - conan-center.py] pre_export(): [META LINES (KB-H025)] OK
    [HOOK - conan-center.py] pre_export(): [CONAN CENTER INDEX URL (KB-H027)] OK
    [HOOK - conan-center.py] pre_export(): [CMAKE MINIMUM VERSION (KB-H028)] OK
    [HOOK - conan-center.py] pre_export(): [TEST PACKAGE - RUN ENVIRONMENT (KB-H029)] OK
    [HOOK - conan-center.py] pre_export(): [SYSTEM REQUIREMENTS (KB-H032)] OK
    [HOOK - conan-center.py] pre_export(): [CONANDATA.YML FORMAT (KB-H030)] OK
    [HOOK - conan-center.py] pre_export(): [TEST PACKAGE - NO IMPORTS() (KB-H034)] OK
    [HOOK - conan-center.py] pre_export(): [NO AUTHOR (KB-H037)] OK
    [HOOK - conan-center.py] pre_export(): [NOT ALLOWED ATTRIBUTES (KB-H039)] OK
    [HOOK - conan-center.py] pre_export(): [NO TARGET NAME (KB-H040)] OK
    [HOOK - conan-center.py] pre_export(): [NO REQUIRES.ADD() (KB-H044)] OK
    [HOOK - conan-center.py] pre_export(): [DELETE OPTIONS (KB-H045)] OK
    [HOOK - conan-center.py] pre_export(): [CMAKE VERBOSE MAKEFILE (KB-H046)] OK
    [HOOK - conan-center.py] pre_export(): [CMAKE VERSION REQUIRED (KB-H048)] OK
    [HOOK - conan-center.py] pre_export(): [CMAKE WINDOWS EXPORT ALL SYMBOLS (KB-H049)] OK
    [HOOK - conan-center.py] pre_export(): [DEFAULT OPTIONS AS DICTIONARY (KB-H051)] OK
    [HOOK - conan-center.py] pre_export(): [PRIVATE IMPORTS (KB-H053)] OK
    [HOOK - conan-center.py] pre_export(): [SINGLE REQUIRES (KB-H055)] OK
    [HOOK - conan-center.py] pre_export(): [TOOLS RENAME (KB-H057)] OK
    [HOOK - conan-center.py] pre_export(): [ILLEGAL CHARACTERS (KB-H058)] OK
    [HOOK - conan-center.py] pre_export(): [CLASS NAME (KB-H059)] OK
    [HOOK - conan-center.py] pre_export(): [NO CRLF (KB-H060)] OK
    [HOOK - conan-center.py] pre_export(): [NO BUILD SYSTEM FUNCTIONS (KB-H061)] OK
    [HOOK - conan-center.py] pre_export(): [TOOLS CROSS BUILDING (KB-H062)] OK
    [HOOK - conan-center.py] pre_export(): [INVALID TOPICS (KB-H064)] OK
    [HOOK - conan-center.py] pre_export(): [NO REQUIRED_CONAN_VERSION (KB-H065)] OK
    [HOOK - conan-center.py] pre_export(): [TEST_TYPE MANAGEMENT (KB-H068)] OK
    [HOOK - conan-center.py] pre_export(): [TEST PACKAGE - NO DEFAULT OPTIONS (KB-H069)] OK
    [HOOK - conan-center.py] pre_export(): [MANDATORY SETTINGS (KB-H070)] OK
    [HOOK - conan-center.py] pre_export(): [PYLINT EXECUTION (KB-H072)] OK
    [HOOK - conan-center.py] pre_export(): [REQUIREMENT OVERRIDE PARAMETER (KB-H075)] OK
    [HOOK - conan-center.py] pre_export(): [NO DANGLING PATCHES (KB-H078)] OK
    WARN: *** Conan 1 is legacy and on a deprecation path ***
    WARN: *** Please upgrade to Conan 2 ***
    [HOOK - conan-center.py] pre_export(): ERROR: [CONFIG.YML HAS NEW VERSION (KB-H052)] The version "b3438" exists in "conandata.yml" but not in "../config.yml", so it will not be built. Please update "../config.yml" to include newly added version "b3438". (https://github.com/conan-io/conan-center-index/blob/master/docs/error_knowledge_base.md#KB-H052-CONFIG.YML-HAS-NEW-VERSION) 
    ERROR: [HOOK - conan-center.py] pre_export(): Some checks failed running the hook, check the output
    
  • llama-cpp/b3040:
    Error running command conan export recipes/llama-cpp/all/conanfile.py llama-cpp/b3040@:

    [pre_export hook output identical to llama-cpp/b2038 above; all checks OK]
    WARN: *** Conan 1 is legacy and on a deprecation path ***
    WARN: *** Please upgrade to Conan 2 ***
    [HOOK - conan-center.py] pre_export(): ERROR: [CONFIG.YML HAS NEW VERSION (KB-H052)] The version "b3438" exists in "conandata.yml" but not in "../config.yml", so it will not be built. Please update "../config.yml" to include newly added version "b3438". (https://github.com/conan-io/conan-center-index/blob/master/docs/error_knowledge_base.md#KB-H052-CONFIG.YML-HAS-NEW-VERSION) 
    ERROR: [HOOK - conan-center.py] pre_export(): Some checks failed running the hook, check the output
    

Note: To save resources, CI tries to finish as soon as an error is found. For this reason you might find that not all the references have been launched or not all the configurations for a given reference. Also, take into account that we cannot guarantee the order of execution as it depends on CI workload and workers availability.


Conan v2 pipeline ❌

Note: Conan v2 builds are now mandatory. Please read our discussion about it.

The v2 pipeline failed. Please, review the errors and note this is required for pull requests to be merged. In case this recipe is still not ported to Conan 2.x, please, ping @conan-io/barbarians on the PR and we will help you.

Failure in build 3 (d6aca872d5c28e4a2573c245bd9dc7c2d2af035e):

  • llama-cpp/b3040:
    CI failed to create some packages (All logs)

    Logs for packageID 01d597222e67ec84823526db99504e7b31bbab95:
    [settings]
    arch=x86_64
    build_type=Release
    compiler=gcc
    compiler.cppstd=17
    compiler.libcxx=libstdc++11
    compiler.version=11
    os=Linux
    [options]
    */*:shared=False
    
    [...]
    ======== Testing the package ========
    Removing previously existing 'test_package' build folder: /home/conan/workspace/prod-v2/bsr/cci-f44dde43/recipes/llama-cpp/all/test_package/build/gcc-11-x86_64-17-release
    llama-cpp/b3040 (test package): Test package build: build/gcc-11-x86_64-17-release
    llama-cpp/b3040 (test package): Test package build folder: /home/conan/workspace/prod-v2/bsr/cci-f44dde43/recipes/llama-cpp/all/test_package/build/gcc-11-x86_64-17-release
    llama-cpp/b3040 (test package): Writing generators to /home/conan/workspace/prod-v2/bsr/cci-f44dde43/recipes/llama-cpp/all/test_package/build/gcc-11-x86_64-17-release/generators
    llama-cpp/b3040 (test package): Generator 'CMakeDeps' calling 'generate()'
    llama-cpp/b3040 (test package): CMakeDeps necessary find_package() and targets for your CMakeLists.txt
        find_package(llama-cpp)
        target_link_libraries(... llama-cpp::llama-cpp)
    llama-cpp/b3040 (test package): Generator 'CMakeToolchain' calling 'generate()'
    llama-cpp/b3040 (test package): CMakeToolchain generated: conan_toolchain.cmake
    llama-cpp/b3040 (test package): CMakeToolchain generated: /home/conan/workspace/prod-v2/bsr/cci-f44dde43/recipes/llama-cpp/all/test_package/build/gcc-11-x86_64-17-release/generators/CMakePresets.json
    llama-cpp/b3040 (test package): CMakeToolchain generated: /home/conan/workspace/prod-v2/bsr/cci-f44dde43/recipes/llama-cpp/all/test_package/CMakeUserPresets.json
    llama-cpp/b3040 (test package): Generator 'VirtualRunEnv' calling 'generate()'
    llama-cpp/b3040 (test package): Generating aggregated env files
    llama-cpp/b3040 (test package): Generated aggregated env files: ['conanrun.sh', 'conanbuild.sh']
    
    ======== Testing the package: Building ========
    llama-cpp/b3040 (test package): Calling build()
    llama-cpp/b3040 (test package): Running CMake.configure()
    llama-cpp/b3040 (test package): RUN: cmake -G "Unix Makefiles" -DCMAKE_TOOLCHAIN_FILE="generators/conan_toolchain.cmake" -DCMAKE_INSTALL_PREFIX="/home/conan/workspace/prod-v2/bsr/cci-f44dde43/recipes/llama-cpp/all/test_package" -DCMAKE_POLICY_DEFAULT_CMP0091="NEW" -DCMAKE_BUILD_TYPE="Release" "/home/conan/workspace/prod-v2/bsr/cci-f44dde43/recipes/llama-cpp/all/test_package"
    -- Using Conan toolchain: /home/conan/workspace/prod-v2/bsr/cci-f44dde43/recipes/llama-cpp/all/test_package/build/gcc-11-x86_64-17-release/generators/conan_toolchain.cmake
    -- Conan toolchain: Defining architecture flag: -m64
    -- Conan toolchain: C++ Standard 17 with extensions OFF
    -- The CXX compiler identification is GNU 11.4.0
    -- Check for working CXX compiler: /usr/local/bin/c++
    -- Check for working CXX compiler: /usr/local/bin/c++ -- works
    -- Detecting CXX compiler ABI info
    -- Detecting CXX compiler ABI info - done
    -- Detecting CXX compile features
    -- Detecting CXX compile features - done
    -- Conan: Component target declared 'llama-cpp::llama'
    -- Conan: Component target declared 'llama-cpp::common'
    -- Conan: Target declared 'llama-cpp::llama-cpp'
    CMake Error at build/gcc-11-x86_64-17-release/generators/cmakedeps_macros.cmake:67 (message):
      Library 'ggml' not found in package.  If 'ggml' is a system library,
      declare it with 'cpp_info.system_libs' property
    Call Stack (most recent call first):
      build/gcc-11-x86_64-17-release/generators/llama-cpp-Target-release.cmake:23 (conan_package_library_targets)
      build/gcc-11-x86_64-17-release/generators/llama-cppTargets.cmake:24 (include)
      build/gcc-11-x86_64-17-release/generators/llama-cpp-config.cmake:16 (include)
      CMakeLists.txt:5 (find_package)
    
    
    -- Configuring incomplete, errors occurred!
    See also "/home/conan/workspace/prod-v2/bsr/cci-f44dde43/recipes/llama-cpp/all/test_package/build/gcc-11-x86_64-17-release/CMakeFiles/CMakeOutput.log".
    
    ERROR: llama-cpp/b3040 (test package): Error in build() method, line 21
    	cmake.configure()
    	ConanException: Error 1 while executing
    
  • llama-cpp/b2038:
    Didn't run or was cancelled before finishing


Note: To save resources, CI tries to finish as soon as an error is found. For this reason you might find that not all the references have been launched or not all the configurations for a given reference. Also, take into account that we cannot guarantee the order of execution as it depends on CI workload and workers availability.
