RAMP from scratch

This is a Rust rewrite of the RAMP (Rapid Assistance in Modelling the Pandemic) model, based on the EcoTwins-withCommuting branch.

Only the initialisation phase, which builds up a cache per study area, has been ported so far.

Running the code

You will first need to install Rust (version 1.57 or later): https://www.rust-lang.org/tools/install

You can then build and run the project:

git clone https://github.com/dabreegster/rampfs/
cd rampfs
# This will take a few minutes the first time you do it, to build external dependencies
cargo build --release
# The executable on Linux and Mac is in ./target/release/ramp. On Windows, you
# probably have to add .exe
./target/release/ramp init west-yorkshire-small

This will download some large files the first time. If all succeeds, you should have processed_data/WestYorkshireSmall.bin as output, as well as lots of intermediate files in raw_data/.

(Note it'll fail the first time through and prompt you to manually run a Python script to get some QUANT data in a different format.)

Troubleshooting

The code depends on proj to transform coordinates. You may need to install additional dependencies, such as cmake, to build it. Please open an issue if you have any trouble!
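
On Linux or macOS, installing the usual build prerequisites looks something like this (an untested sketch; package names vary by platform and distro):

# Debian/Ubuntu
sudo apt install cmake pkg-config libproj-dev
# macOS with Homebrew
brew install cmake pkg-config proj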

Some tips for working with Rust

There are two equivalent ways to rebuild and then run the code. First:

cargo run --release -- init devon

The -- separates arguments to cargo (the Rust build tool) from arguments to the program itself. The second way:

cargo build --release
./target/release/ramp init devon

You can build the code in two modes: debug and release. There's a simple tradeoff: debug mode is fast to build but slow to run, while release mode is slow to build but fast to run. For the RAMP codebase, since the input data is so large and the code itself so small, I'd recommend always using --release. If you want debug mode, just omit the flag.
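
For reference, the debug-mode equivalents of the commands above are the same commands without the flag (and the binary then lands in ./target/debug):

cargo run -- init devon
# or, equivalently
cargo build
./target/debug/ramp init devon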

Comments
  • Huge PR to bring the model to the current stage of the BMI study.

    • New functions in OCL code
    • Lots of modifications to run the new yml file; later we will replace that structure for the other areas.
    • The new_paramaters.yml file is the one to use (temporarily)
    • The model runs in headless and GUI mode using python headless.py -p config/new_parameters.yml
    opened by mfbenitezp 5
  • Envs with python from previous Conda Env and Python version

    Testing on another macOS machine, we found that the Python version and the way the environment is created might affect correct installation. 1) If users already have Python from Conda envs, which can be quite common, poetry install shows

    zsh:1: no such file or directory: /opt/concourse/worker/volumes/live/c1a1a6ef-e724-4ad9-52a7-d6d68451dacb/volume/python-split_1631807121927/_build_env/bin/llvm-ar
            zsh:1: no such file or directory: /opt/concourse/worker/volumes/live/c1a1a6ef-e724-4ad9-52a7-d6d68451dacb/volume/python-split_1631807121927/_build_env/bin/llvm-ar
            error: Command "/opt/concourse/worker/volumes/live/c1a1a6ef-e724-4ad9-52a7-d6d68451dacb/volume/python-split_1631807121927/_build_env/bin/llvm-ar rcs build/temp.macosx-10.9-x86_64-3.9/libnpymath.a build/temp.macosx-10.9-x86_64-3.9/numpy/core/src/npymath/npy_math.o build/temp.macosx-10.9-x86_64-3.9/build/src.macosx-10.9-x86_64-3.9/numpy/core/src/npymath/ieee754.o build/temp.macosx-10.9-x86_64-3.9/build/src.macosx-10.9-x86_64-3.9/numpy/core/src/npymath/npy_math_complex.o build/temp.macosx-10.9-x86_64-3.9/numpy/core/src/npymath/halffloat.o" failed with exit status 127
            ----------------------------------------
        ERROR: Command errored out with exit status 1: /Users/jding/opt/anaconda3/envs/ua/bin/python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/_p/_n8j38ls4wq303jqcy3mt0cr0000gr/T/pip-install-yt692vf0/numpy_846fe9e98d334320be8531dc9d52f948/setup.py'"'"'; __file__='"'"'/private/var/folders/_p/_n8j38ls4wq303jqcy3mt0cr0000gr/T/pip-install-yt692vf0/numpy_846fe9e98d334320be8531dc9d52f948/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /private/var/folders/_p/_n8j38ls4wq303jqcy3mt0cr0000gr/T/pip-record-_cno4xb0/install-record.txt --single-version-externally-managed --prefix /private/var/folders/_p/_n8j38ls4wq303jqcy3mt0cr0000gr/T/pip-build-env-iwnc5r0v/overlay --compile --install-headers /private/var/folders/_p/_n8j38ls4wq303jqcy3mt0cr0000gr/T/pip-build-env-iwnc5r0v/overlay/include/python3.9/numpy Check the logs for full command output.
        ----------------------------------------
      WARNING: Discarding file:///Users/jding/Library/Caches/pypoetry/artifacts/1b/f8/77/cd6cb033665a9495af3468fb24629d33fa626db7b10a8abeb47a016bf1/pandas-1.0.3.tar.gz. Command errored out with exit status 1: /Users/jding/opt/anaconda3/envs/ua/bin/python /private/var/folders/_p/_n8j38ls4wq303jqcy3mt0cr0000gr/T/pip-standalone-pip-p1dzj8yo/__env_pip__.zip/pip install --ignore-installed --no-user --prefix /private/var/folders/_p/_n8j38ls4wq303jqcy3mt0cr0000gr/T/pip-build-env-iwnc5r0v/overlay --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- setuptools wheel 'Cython>=0.29.13' 'numpy==1.13.3; python_version=='"'"'3.6'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.14.5; python_version>='"'"'3.7'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.16.0; python_version=='"'"'3.6'"'"' and platform_system=='"'"'AIX'"'"'' 'numpy==1.16.0; python_version>='"'"'3.7'"'"' and platform_system=='"'"'AIX'"'"'' Check the logs for full command output.
      ERROR: Command errored out with exit status 1: /Users/jding/opt/anaconda3/envs/ua/bin/python /private/var/folders/_p/_n8j38ls4wq303jqcy3mt0cr0000gr/T/pip-standalone-pip-p1dzj8yo/__env_pip__.zip/pip install --ignore-installed --no-user --prefix /private/var/folders/_p/_n8j38ls4wq303jqcy3mt0cr0000gr/T/pip-build-env-iwnc5r0v/overlay --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- setuptools wheel 'Cython>=0.29.13' 'numpy==1.13.3; python_version=='"'"'3.6'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.14.5; python_version>='"'"'3.7'"'"' and platform_system!='"'"'AIX'"'"'' 'numpy==1.16.0; python_version=='"'"'3.6'"'"' and platform_system=='"'"'AIX'"'"'' 'numpy==1.16.0; python_version>='"'"'3.7'"'"' and platform_system=='"'"'AIX'"'"'' Check the logs for full command output.
      
    
      at ~/.poetry/lib/poetry/utils/env.py:1195 in _run
          1191│                 output = subprocess.check_output(
          1192│                     cmd, stderr=subprocess.STDOUT, **kwargs
          1193│                 )
          1194│         except CalledProcessError as e:
        → 1195│             raise EnvCommandError(e, input=input_)
          1196│ 
          1197│         return decode(output)
          1198│ 
          1199│     def execute(self, bin, *args, **kwargs):
    

    Then the issue was about the Python version, for which we got

    Command ['/Users/jding/Library/Caches/pypoetry/virtualenvs/ramp-u2zQ2mbi-py3.10/bin/pip', 'install', '--no-deps', '/Users/jding/Library/Caches/pypoetry/artifacts/1d/39/69/fd592161731e8c197899f57a8d2142e2e2fd24626c2ee55ba72942ec9f/pandas-1.0.3.tar.gz'] errored with the following return code 1, and output: 
      Processing /Users/jding/Library/Caches/pypoetry/artifacts/1d/39/69/fd592161731e8c197899f57a8d2142e2e2fd24626c2ee55ba72942ec9f/pandas-1.0.3.tar.gz
        Installing build dependencies: started
        Installing build dependencies: still running...
        Installing build dependencies: still running...
        Installing build dependencies: still running...
        Installing build dependencies: still running...
        Installing build dependencies: finished with status 'error'
        error: subprocess-exited-with-error
    
    × pip subprocess to install build dependencies did not run successfully.
      │ exit code: 1
      ╰─> [4308 lines of output]
          Ignoring numpy: markers 'python_version == "3.6" and platform_system != "AIX"' don't match your environment
          Ignoring numpy: markers 'python_version == "3.6" and platform_system == "AIX"' don't match your environment
          Ignoring numpy: markers 'python_version >= "3.7" and platform_system == "AIX"' don't match your environment
          Collecting setuptools
            Using cached setuptools-60.9.3-py3-none-any.whl (1.1 MB)
          Collecting wheel
            Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB)
          Collecting Cython>=0.29.13
            Using cached Cython-0.29.28-py2.py3-none-any.whl (983 kB)
          Collecting numpy==1.14.5
            Using cached numpy-1.14.5.zip (4.9 MB)
            Preparing metadata (setup.py): started
            Preparing metadata (setup.py): finished with status 'done'
          Building wheels for collected packages: numpy
            Building wheel for numpy (setup.py): started
            Building wheel for numpy (setup.py): still running...
            Building wheel for numpy (setup.py): still running...
            Building wheel for numpy (setup.py): still running...
            Building wheel for numpy (setup.py): still running...
            Building wheel for numpy (setup.py): finished with status 'error'
            error: subprocess-exited-with-error
          
            × python setup.py bdist_wheel did not run successfully.
            │ exit code: 1
    

    Then the solution was to install Python 3.8.8 so that poetry install would run correctly. After that we got the issue reported in #14, which we solved as described there, and then issue #15, where not much could be done. I guess there is a need to make clear in the README that users need to install the ASPICS requirements outside of any previously installed Conda env.

    opened by mfbenitezp 4
  • Build failed on macOS

    When I ran cargo build I got:

    error: failed to run custom build command for proj-sys v0.18.4 Caused by: process didn’t exit successfully: /Users/fbenitez/Documents/ResearchATI/EcoTwins_Rust/rampfs/target/release/build/proj-sys-913625d2b4418d78/build-script-build (exit status: 101) --- stdout cargo:rerun-if-env-changed=PROJ_NO_PKG_CONFIG cargo:rerun-if-env-changed=PKG_CONFIG_x86_64-apple-darwin cargo:rerun-if-env-changed=PKG_CONFIG_x86_64_apple_darwin cargo:rerun-if-env-changed=HOST_PKG_CONFIG cargo:rerun-if-env-changed=PKG_CONFIG cargo:rerun-if-env-changed=PROJ_STATIC cargo:rerun-if-env-changed=PROJ_DYNAMIC cargo:rerun-if-env-changed=PKG_CONFIG_ALL_STATIC cargo:rerun-if-env-changed=PKG_CONFIG_ALL_DYNAMIC cargo:rerun-if-env-changed=PKG_CONFIG_PATH_x86_64-apple-darwin cargo:rerun-if-env-changed=PKG_CONFIG_PATH_x86_64_apple_darwin cargo:rerun-if-env-changed=HOST_PKG_CONFIG_PATH cargo:rerun-if-env-changed=PKG_CONFIG_PATH cargo:rerun-if-env-changed=PKG_CONFIG_LIBDIR_x86_64-apple-darwin cargo:rerun-if-env-changed=PKG_CONFIG_LIBDIR_x86_64_apple_darwin cargo:rerun-if-env-changed=HOST_PKG_CONFIG_LIBDIR cargo:rerun-if-env-changed=PKG_CONFIG_LIBDIR cargo:rerun-if-env-changed=PKG_CONFIG_SYSROOT_DIR_x86_64-apple-darwin cargo:rerun-if-env-changed=PKG_CONFIG_SYSROOT_DIR_x86_64_apple_darwin cargo:rerun-if-env-changed=HOST_PKG_CONFIG_SYSROOT_DIR cargo:rerun-if-env-changed=PKG_CONFIG_SYSROOT_DIR CMAKE_TOOLCHAIN_FILE_x86_64-apple-darwin = None CMAKE_TOOLCHAIN_FILE_x86_64_apple_darwin = None HOST_CMAKE_TOOLCHAIN_FILE = None CMAKE_TOOLCHAIN_FILE = None CMAKE_GENERATOR_x86_64-apple-darwin = None CMAKE_GENERATOR_x86_64_apple_darwin = None HOST_CMAKE_GENERATOR = None CMAKE_GENERATOR = None CMAKE_PREFIX_PATH_x86_64-apple-darwin = None CMAKE_PREFIX_PATH_x86_64_apple_darwin = None HOST_CMAKE_PREFIX_PATH = None CMAKE_PREFIX_PATH = None CMAKE_x86_64-apple-darwin = None CMAKE_x86_64_apple_darwin = None HOST_CMAKE = None CMAKE = None running: “cmake” “/Users/fbenitez/.cargo/registry/src/github.com-1ecc6299db9ec823/proj-sys-0.18.4/PROJSRC/proj/proj-7.1.0" “-DBUILD_SHARED_LIBS=OFF” “-DBUILD_TESTING=OFF” “-DBUILD_CCT=OFF” “-DBUILD_CS2CS=OFF” “-DBUILD_GEOD=OFF” “-DBUILD_GIE=OFF” “-DBUILD_PROJ=OFF” “-DBUILD_PROJINFO=OFF” “-DBUILD_PROJSYNC=OFF” “-DENABLE_CURL=OFF” “-DENABLE_TIFF=OFF” “-DCMAKE_INSTALL_PREFIX=/Users/fbenitez/Documents/ResearchATI/EcoTwins_Rust/rampfs/target/release/build/proj-sys-216d9a73e00ccca2/out” “-DCMAKE_C_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64 -arch x86_64” “-DCMAKE_C_COMPILER=/usr/bin/cc” “-DCMAKE_CXX_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64 -arch x86_64” “-DCMAKE_CXX_COMPILER=/usr/bin/c++” “-DCMAKE_ASM_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64 -arch x86_64” “-DCMAKE_ASM_COMPILER=/usr/bin/cc” “-DCMAKE_BUILD_TYPE=Release” --- stderr pkg-config unable to find existing libproj installation: Could not run “pkg-config” “--libs” “--cflags” “proj” “proj >= 7.1.0" The pkg-config command could not be found. Most likely, you need to install a pkg-config package for your OS. Try brew install pkg-config if you have Homebrew. If you’ve already installed it, ensure the pkg-config command is one of the directories in the PATH environment variable. If you did not expect this build to link to a pre-installed system library, then check documentation of the proj-sys crate for an option to build the library from source, or disable features or dependencies that require pkg-config. 
building libproj from source thread ‘main’ panicked at ' failed to execute command: No such file or directory (os error 2) is cmake not installed? build script failed, must exit now’, /Users/fbenitez/.cargo/registry/src/github.com-1ecc6299db9ec823/cmake-0.1.46/src/lib.rs:974:5 note: run with RUST_BACKTRACE=1 environment variable to display a backtrace warning: build failed, waiting for other jobs to finish... error: build failed

    opened by mfbenitezp 4
  • Optionally reduce obesity while converting snapshots. #24

    You can now specify --reduce_obesity while converting snapshots. The logic mimics https://github.com/Urban-Analytics/RAMP-UA/blob/c5ecee2ab8eebd2b9cffc41956c7c83195b709d1/coding/model/opencl/ramp/snapshot.py#L172. Note that the docstring in that reference code disagrees with the implementation. We're not transforming overweight.

    If there's any ambiguity about the transformation being done, we can write a simple unit test and show the mapping exactly.

    opened by dabreegster 3
  • Add an alternate conda environment for running the Python code on Mac

    I'm testing on Linux right now, and I can't get this working -- as it is, I hit "No compatible device found" trying to start OpenCL, even though the software drivers from pocl are installed. If I add ocl-icd-system, I get a segfault instead. Everything works fine in conda over in Ramp_UA ecotwins branch, and I haven't figured out the difference here yet.

    I won't have access to my Mac till tonight, but I'll try it there then.

    opened by dabreegster 3
  • Python protobuf implementation can't handle large files

    Google scale, eh? The London output proto is 3.2GB, and ParseFromString crashes. The gdb backtrace (poetry run gdb --args python convert_snapshot.py -i ../spc/data/output/london.pb -o data/processed_data/London/snapshot/cache.npz):

    #0  0x00007fffef440f57 in char const* google::protobuf::internal::ReadPackedVarintArray<google::protobuf::internal::VarintParser<unsigned long, false>(void*, char const*, google::protobuf::internal::ParseContext*)::{lambda(unsigned long)#1}>(char const*, char const*, google::protobuf::internal::VarintParser<unsigned long, false>(void*, char const*, google::protobuf::internal::ParseContext*)::{lambda(unsigned long)#1}) ()
       from /home/dabreegster/.cache/pypoetry/virtualenvs/aspics-8OI0A7SA-py3.8/lib/python3.8/site-packages/google/protobuf/pyext/_message.cpython-38-x86_64-linux-gnu.so
    #1  0x00007fffef441167 in char const* google::protobuf::internal::EpsCopyInputStream::ReadPackedVarint<google::protobuf::internal::VarintParser<unsigned long, false>(void*, char const*, google::protobuf::internal::ParseContext*)::{lambda(unsigned long)#1}>(char const*, google::protobuf::internal::VarintParser<unsigned long, false>(void*, char const*, google::protobuf::internal::ParseContext*)::{lambda(unsigned long)#1}) ()
       from /home/dabreegster/.cache/pypoetry/virtualenvs/aspics-8OI0A7SA-py3.8/lib/python3.8/site-packages/google/protobuf/pyext/_message.cpython-38-x86_64-linux-gnu.so
    #2  0x00007fffef43e532 in google::protobuf::internal::PackedUInt64Parser(void*, char const*, google::protobuf::internal::ParseContext*) ()
       from /home/dabreegster/.cache/pypoetry/virtualenvs/aspics-8OI0A7SA-py3.8/lib/python3.8/site-packages/google/protobuf/pyext/_message.cpython-38-x86_64-linux-gnu.so
    #3  0x00007fffef51d8ae in google::protobuf::internal::WireFormat::_InternalParseAndMergeField(google::protobuf::Message*, char const*, google::protobuf::internal::ParseContext*, unsigned long, google::protobuf::Reflection const*, google::protobuf::FieldDescriptor const*) ()
       from /home/dabreegster/.cache/pypoetry/virtualenvs/aspics-8OI0A7SA-py3.8/lib/python3.8/site-packages/google/protobuf/pyext/_message.cpython-38-x86_64-linux-gnu.so
    #4  0x00007fffef51f66a in google::protobuf::internal::WireFormat::_InternalParse(google::protobuf::Message*, char const*, google::protobuf::internal::ParseContext*) ()
       from /home/dabreegster/.cache/pypoetry/virtualenvs/aspics-8OI0A7SA-py3.8/lib/python3.8/site-packages/google/protobuf/pyext/_message.cpython-38-x86_64-linux-gnu.so
    

    I suspect the f.read() isn't managing to slurp in the entire file.
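
    One quick way to check that hypothesis (a hypothetical sketch, reusing the path from the command above) is to compare how many bytes f.read() actually returns against the on-disk size:

    import os

    path = "../spc/data/output/london.pb"  # same file as in the command above
    with open(path, "rb") as f:
        data = f.read()
    # If these numbers differ, the read really is truncated; if they match,
    # the crash is more likely a limit inside protobuf's parser itself.
    print(len(data), os.path.getsize(path))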

    opened by dabreegster 2
  • "prng.cl" file not found #include

    After running:

    poetry run python gui.py -p model_parameters/default.yml

    I got

    <program source>:3:10: fatal error: 'prng.cl' file not found
    #include "prng.cl"
    

    Apparently it has something to do with the headers and the way OpenCL reads them on Mac. https://discourse.julialang.org/t/including-header-files-for-opencl-kernels/36551

    I need to do other things now, but I will try more later; so far there is no reasonable explanation for this weird issue.

    opened by mfbenitezp 2
  • Issue with macOS and OpenGL library location in python Env

    Apparently, Big Sur no longer supports the OpenGL library or other system libraries in the standard locations in the file system and instead uses a cache. When I run poetry run python gui.py -p model_parameters/default.yml I get ImportError: ('Unable to load OpenGL library', "dlopen(OpenGL, 0x000A):

    Earlier we found a way to address that using the suggestion from: https://github.com/PixarAnimationStudios/USD/issues/1372

    The way to address that was:

    • Find the file OpenGL/platform/ctypesloader.py in the Python env created by Poetry.

    • Update/Edit the line

    fullName = util.find_library( name )

    to

    fullName = '/System/Library/Frameworks/OpenGL.framework/OpenGL'

    opened by mfbenitezp 2
  • Enhanced functions for get_symptomatic_prob_for_age and get_mortality_prob_for_age towards the BMI studies.

    • Integration of the enhanced functions in ramp_ua.cl
    • Still work in progress: the model runs in GUI and headless mode, but the output data does not make sense, so following your suggestion @dabreegster I'm opening the PR for a quick review.
    • There are comments I need to keep to recall what I have done; please do not remove them.
    • Sex and Origin attributes from SPC are now included in the snapshot, but I was not able to bring in the new_bmi variable; if you can tell me what I did wrong, fantastic. For now it uses obesity as bmi, but we need new_bmi.
    • Here we use the new_parameters.yml file.
    • I will do some polishing in params.py once the estimations make sense.
    opened by mfbenitezp 1
  • New set of parameters for a tuned model towards BMI studies

    • New set of parameters in new_parameters.yml file
    • Changes in Loader, params, ramp_ua.cl
    • Updated Jupyter notebook (still work in progress)
    • load_msoa_location (still in progress, not finished, but would be useful for notebooks)
    • Removed the nightclubs category in most of the code; it still persists in the OpenCL kernels.
    • And a bunch of other small details, but a functional model (headless), not tested in the GUI.
    • Reads SPC data
    opened by mfbenitezp 1
  • New set of parameters

    Hey @dabreegster, when you have some time, what I'm trying to incorporate is this new set of parameters, and then I can also calibrate them. Any help would be appreciated.

    https://github.com/alan-turing-institute/uatk-aspics/blob/24273b30ed4c67bcd7d8869c35230881eb50131c/config/Rutland_Test.yml#L57

    opened by mfbenitezp 1
  • Update model of BMI associated risks

    • Change BMI array from 12 parameters to 16, see fitData.csv attached.
    • Update every calculation of oddBMI from
    oddBMI = params->bmi_multipliers[originNew * 3] + params->bmi_multipliers[originNew * 3 + 1] * [!!! BMI VARIABLE !!!] + params->bmi_multipliers[originNew * 3 + 2] * pown([!!! BMI VARIABLE !!!],2);
    

    to

    oddBMI = params->bmi_multipliers[originNew * 4] + params->bmi_multipliers[originNew * 4 + 1] * [!!! BMI VARIABLE !!!] + params->bmi_multipliers[originNew * 4 + 2] * pown([!!! BMI VARIABLE !!!],2) + params->bmi_multipliers[originNew * 4 + 3] * pown([!!! BMI VARIABLE !!!],3);
    

    replacing [!!! BMI VARIABLE !!!] accordingly

    • update scenarios to BMI = 26.7 / if (new_bmi > 26.7) / if (new_bmi >= 27.7)
    • add else if (new_bmi > 50){ //oddBMI is calculated using 50 for [!!! BMI VARIABLE !!!]}
    opened by HSalat 9
  • Seeding with new parameters

    https://github.com/alan-turing-institute/uatk-aspics/blob/bf59262366959d78ce06fc837485a2a70abba6c8/aspics/simulator.py#L283

    people_ages = self.start_snapshot.buffers.people_ages
    people_obesity = self.start_snapshot.buffers.people_obesity
    

    Replace these with a morbidity buffer (which I guess is incorrectly named age_morbidity_multipliers); let's call it people_morbidity

    symptomatic_prob = cov_params.age_morbidity_multipliers[ #TODO ask Hadrien, now we use "age_morbidity_multipliers", rather than symptomatic_probs
            min(math.floor(people_ages[i] / 10), 8)
    ]
    if people_obesity[i] > 2:
            symptomatic_prob = symptomatic_prob * cov_params.symptomatic_multiplier #TODO here we will use symptomatic_multiplier, rather than overweight_sympt_mplier
    

    Replace with:

    symptomatic_prob = people_morbidity[i]
    
    opened by HSalat 2
  • List of urgent tasks towards a more tuned ASPICS model

    • [ ] Fix lockdown
    • [ ] Update area to Greater Manchester - Update seeding / start date / number of days
    • [ ] Run calibration number 1
    • [ ] Run calibration number 2
    • [ ] Run different scenarios
    • [ ] Finish the paper (Calibration), Contribute with Karyn's Paper (BMI)
    opened by mfbenitezp 0
  • Events model

    At the end of def step() in simulator.py, we'll run the events logic. Based on the current date, go through each event and call findVisitors. Then run transmission logic in Python and upload the new disease statuses to OpenCL.

    So one decision: how do we draw visitors for the event? We can't just use the snapshot / OpenCL state, because we've lost household membership and other state by then. It also ties things to ASPICS too closely. So I think now ASPICS needs to read the snapshot file, the original protobuf file, and this optional events file. When we draw visitors, we also have to filter by ASPICS current state (dead or symptomatic people don't go to events).

    Need to filter events by the study area boundary.
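
    As a rough illustration of that flow (a hypothetical sketch only -- names like find_visitors, run_transmission, and upload_disease_statuses are placeholders, not the actual ASPICS API):

    def run_events(sim, current_date, events, study_area):
        for event in events:
            if event.date != current_date or not study_area.contains(event.location):
                continue
            # Draw candidate visitors, then drop anyone whose current ASPICS
            # state rules them out (dead or symptomatic people stay home).
            visitors = [p for p in event.find_visitors(sim.population)
                        if sim.status[p] not in ("dead", "symptomatic")]
            new_statuses = run_transmission(event, visitors)  # transmission in plain Python
            sim.upload_disease_statuses(new_statuses)         # push results back to OpenCL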

    Code from @HSalat attached eventsHandler.py.txt

    opened by dabreegster 1
  • Calibrate with pyabc

    @mfbenitezp will work on this. I can help initially get things running with poetry if needed. If we're following https://github.com/Urban-Analytics/RAMP-UA/tree/master/experiments/calibration, maybe all of this has to run in a notebook

    opened by dabreegster 0