
Deploy details

Deploy successful for lfailandscape

Update Landscape from LFX 2025-01-11 (#835)

Production: main@8df7a91

Deploy log

Initializing

Complete
4:27:47 AM: Build ready to start
4:28:21 AM: build-image version: f648e6419ded16d58b87206d1c66874e8172841d (focal)
4:28:21 AM: buildbot version: 036f4db3997c007a7ee0f1b16bcde8318de8beac
4:28:21 AM: Fetching cached dependencies
4:28:21 AM: Starting to download cache of 197.7MB
4:28:22 AM: Finished downloading cache in 1.363s
4:28:22 AM: Starting to extract cache
4:28:25 AM: Finished extracting cache in 2.653s
4:28:25 AM: Finished fetching cache in 4.075s
4:28:25 AM: Starting to prepare the repo for build
4:28:25 AM: Preparing Git Reference refs/heads/main
4:28:26 AM: Custom build path detected. Proceeding with the specified path: 'netlify'
4:28:26 AM: Custom functions path detected. Proceeding with the specified path: 'functions'
4:28:26 AM: Custom build command detected. Proceeding with the specified command: '(wget --no-check-certificate --no-cache https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js) && node landscape.js'
4:28:26 AM: Custom ignore command detected. Proceeding with the specified command: 'false'
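
The custom build command detected above is the standard landscapeapp bootstrap: fetch the shared netlify/landscape.js driver from cncf/landscapeapp and run it with Node. A minimal sketch of those same two steps using only Node's child_process (not the actual Netlify buildbot code):

    // Sketch of the custom build command shown above, assuming a stock Node runtime.
    const { execSync } = require('child_process');

    // Download the shared landscape.js build driver from cncf/landscapeapp.
    execSync(
      'wget --no-check-certificate --no-cache ' +
      'https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js',
      { stdio: 'inherit' }
    );

    // Run the downloaded driver with the same environment (execSync inherits process.env).
    execSync('node landscape.js', { stdio: 'inherit' });
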
4:28:27 AM: manpath: warning: $PATH not set
4:28:27 AM: Starting to install dependencies
4:28:28 AM: Started restoring cached mise cache
4:28:28 AM: Finished restoring cached mise cache
4:28:29 AM: mise python@3.13.1 install
4:28:29 AM: mise python@3.13.1 download cpython-3.13.1+20250106-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
4:28:29 AM: mise python@3.13.1 extract cpython-3.13.1+20250106-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
4:28:30 AM: mise python@3.13.1 python --version
4:28:30 AM: mise python@3.13.1 Python 3.13.1
4:28:30 AM: mise python@3.13.1 installed
4:28:30 AM: Python version set to 3.13
4:28:32 AM: Collecting pipenv
4:28:32 AM: Downloading pipenv-2024.4.0-py3-none-any.whl.metadata (19 kB)
4:28:32 AM: Collecting certifi (from pipenv)
4:28:32 AM: Downloading certifi-2024.12.14-py3-none-any.whl.metadata (2.3 kB)
4:28:32 AM: Collecting packaging>=22 (from pipenv)
4:28:32 AM: Downloading packaging-24.2-py3-none-any.whl.metadata (3.2 kB)
4:28:32 AM: Collecting setuptools>=67 (from pipenv)
4:28:32 AM: Downloading setuptools-75.8.0-py3-none-any.whl.metadata (6.7 kB)
4:28:32 AM: Collecting virtualenv>=20.24.2 (from pipenv)
4:28:32 AM: Downloading virtualenv-20.28.1-py3-none-any.whl.metadata (4.5 kB)
4:28:32 AM: Collecting distlib<1,>=0.3.7 (from virtualenv>=20.24.2->pipenv)
4:28:32 AM: Downloading distlib-0.3.9-py2.py3-none-any.whl.metadata (5.2 kB)
4:28:32 AM: Collecting filelock<4,>=3.12.2 (from virtualenv>=20.24.2->pipenv)
4:28:32 AM: Downloading filelock-3.16.1-py3-none-any.whl.metadata (2.9 kB)
4:28:32 AM: Collecting platformdirs<5,>=3.9.1 (from virtualenv>=20.24.2->pipenv)
4:28:32 AM: Downloading platformdirs-4.3.6-py3-none-any.whl.metadata (11 kB)
4:28:32 AM: Downloading pipenv-2024.4.0-py3-none-any.whl (3.0 MB)
4:28:32 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.0/3.0 MB 48.5 MB/s eta 0:00:00
4:28:32 AM: Downloading packaging-24.2-py3-none-any.whl (65 kB)
4:28:33 AM: Downloading setuptools-75.8.0-py3-none-any.whl (1.2 MB)
4:28:33 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 54.2 MB/s eta 0:00:00
4:28:33 AM: Downloading virtualenv-20.28.1-py3-none-any.whl (4.3 MB)
4:28:33 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.3/4.3 MB 120.9 MB/s eta 0:00:00
4:28:33 AM: Downloading certifi-2024.12.14-py3-none-any.whl (164 kB)
4:28:33 AM: Downloading distlib-0.3.9-py2.py3-none-any.whl (468 kB)
4:28:33 AM: Downloading filelock-3.16.1-py3-none-any.whl (16 kB)
4:28:33 AM: Downloading platformdirs-4.3.6-py3-none-any.whl (18 kB)
4:28:33 AM: Installing collected packages: distlib, setuptools, platformdirs, packaging, filelock, certifi, virtualenv, pipenv
4:28:35 AM: Successfully installed certifi-2024.12.14 distlib-0.3.9 filelock-3.16.1 packaging-24.2 pipenv-2024.4.0 platformdirs-4.3.6 setuptools-75.8.0 virtualenv-20.28.1
4:28:36 AM: Attempting Ruby version 2.6.2, read from environment
4:28:36 AM: Required ruby-2.6.2 is not installed - installing.
4:28:36 AM: Searching for binary rubies, this might take some time.
4:28:36 AM: Checking requirements for ubuntu.
4:28:37 AM: Requirements installation successful.
4:28:37 AM: ruby-2.6.2 - #configure
4:28:37 AM: ruby-2.6.2 - #download
4:28:37 AM: ruby-2.6.2 - #validate archive
4:28:41 AM: ruby-2.6.2 - #extract
4:28:42 AM: ruby-2.6.2 - #validate binary
4:28:43 AM: ruby-2.6.2 - #setup
4:28:43 AM: ruby-2.6.2 - #gemset created /opt/buildhome/.rvm/gems/ruby-2.6.2@global
4:28:43 AM: ruby-2.6.2 - #importing gemset /opt/buildhome/.rvm/gemsets/global.gems........................................
4:28:44 AM: ruby-2.6.2 - #generating global wrappers........
4:28:44 AM: ruby-2.6.2 - #gemset created /opt/buildhome/.rvm/gems/ruby-2.6.2
4:28:44 AM: ruby-2.6.2 - #importing gemsetfile /opt/buildhome/.rvm/gemsets/default.gems evaluated to empty gem list
4:28:44 AM: ruby-2.6.2 - #generating default wrappers........
4:28:44 AM: Using /opt/buildhome/.rvm/gems/ruby-2.6.2
4:28:45 AM: Using Ruby version 2.6.2
4:28:45 AM: Started restoring cached go cache
4:28:45 AM: Finished restoring cached go cache
4:28:45 AM: Installing Go version 1.12 (requested 1.12)
4:28:49 AM: go version go1.12 linux/amd64
4:28:50 AM: Using PHP version 8.0
4:28:52 AM: Started restoring cached Node.js version
4:28:53 AM: Finished restoring cached Node.js version
4:28:54 AM: v14.3.0 is already installed.
4:28:54 AM: Now using node v14.3.0 (npm v6.14.5)
4:28:54 AM: Started restoring cached build plugins
4:28:54 AM: Finished restoring cached build plugins
4:28:54 AM: Successfully installed dependencies
4:28:54 AM: Starting build script
4:28:56 AM: Detected 1 framework(s)
4:28:56 AM: "cecil" at version "unknown"
4:28:56 AM: Section completed: initializing

Building

Complete
4:28:58 AM: Netlify Build
4:28:58 AM: ────────────────────────────────────────────────────────────────
4:28:58 AM:
4:28:58 AM: ❯ Version
4:28:58 AM: @netlify/build 29.58.2
4:28:58 AM:
4:28:58 AM: ❯ Flags
4:28:58 AM: accountId: 5a55185e8198766884f04205
4:28:58 AM: baseRelDir: false
4:28:58 AM: buildId: 6781f34351aaf00008d98836
4:28:58 AM: deployId: 6781f34351aaf00008d98838
4:28:58 AM:
4:28:58 AM: ❯ Current directory
4:28:58 AM: /opt/build/repo/netlify
4:28:58 AM:
4:28:58 AM: ❯ Config file
4:28:58 AM: /opt/build/repo/netlify.toml
4:28:58 AM:
4:28:58 AM: ❯ Context
4:28:58 AM: production
4:28:58 AM:
4:28:58 AM: build.command from netlify.toml
4:28:58 AM: ────────────────────────────────────────────────────────────────
4:28:58 AM:
4:28:58 AM: $ (wget --no-check-certificate --no-cache https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js) && node landscape.js
4:28:58 AM: Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.108.133, 185.199.110.133, 185.199.109.133, ...
4:28:58 AM: Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.108.133|:443... connected.
4:28:58 AM: HTTP request sent, awaiting response... 200 OK
4:28:58 AM: Length: 8750 (8.5K) [text/plain]
4:28:58 AM: Saving to: ‘landscape.js’
4:28:58 AM: 0K ........ 100% 152M=0s
4:28:58 AM: 2025-01-11 04:28:58 (152 MB/s) - ‘landscape.js’ saved [8750/8750]
4:28:58 AM: We have a secret: c8***75
4:28:58 AM: We have a secret: 8G***pb
4:28:58 AM: We have a secret: 87***eb
4:28:58 AM: We have a secret: gh***7r
4:28:58 AM: starting /opt/build/repo/netlify
4:28:58 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:28:58 AM: Warning: Permanently added '147.75.199.15' (ECDSA) to the list of known hosts.
4:28:59 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:28:59 AM: * Documentation: https://help.ubuntu.com
4:28:59 AM: * Management: https://landscape.canonical.com
4:28:59 AM: * Support: https://ubuntu.com/advantage
4:28:59 AM: System information as of Sat Jan 11 04:28:58 UTC 2025
4:28:59 AM: System load: 0.18
4:28:59 AM: Usage of /: 38.8% of 217.51GB
4:28:59 AM: Memory usage: 19%
4:28:59 AM: Swap usage: 0%
4:28:59 AM: Processes: 621
4:28:59 AM: Users logged in: 1
4:28:59 AM: IPv4 address for bond0: 147.75.199.15
4:28:59 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:28:59 AM: IPv4 address for docker0: 172.17.0.1
4:28:59 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:28:59 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:28:59 AM: 86 updates can be applied immediately.
4:28:59 AM: 2 of these updates are standard security updates.
4:28:59 AM: To see these additional updates run: apt list --upgradable
4:28:59 AM: New release '22.04.5 LTS' available.
4:28:59 AM: Run 'do-release-upgrade' to upgrade to it.
4:28:59 AM: 2 updates could not be installed automatically. For more details,
4:28:59 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:28:59 AM: *** System restart required ***
4:28:59 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:28:59 AM: Warning: Permanently added '147.75.199.15' (ECDSA) to the list of known hosts.
4:28:59 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:28:59 AM: * Documentation: https://help.ubuntu.com
4:28:59 AM: * Management: https://landscape.canonical.com
4:28:59 AM: * Support: https://ubuntu.com/advantage
4:28:59 AM: System information as of Sat Jan 11 04:28:58 UTC 2025
4:28:59 AM: System load: 0.18
4:28:59 AM: Usage of /: 38.8% of 217.51GB
4:28:59 AM: Memory usage: 19%
4:28:59 AM: Swap usage: 0%
4:28:59 AM: Processes: 621
4:28:59 AM: Users logged in: 1
4:28:59 AM: IPv4 address for bond0: 147.75.199.15
4:28:59 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:28:59 AM: IPv4 address for docker0: 172.17.0.1
4:28:59 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:28:59 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:28:59 AM: 86 updates can be applied immediately.
4:28:59 AM: 2 of these updates are standard security updates.
4:28:59 AM: To see these additional updates run: apt list --upgradable
4:28:59 AM: New release '22.04.5 LTS' available.
4:28:59 AM: Run 'do-release-upgrade' to upgrade to it.
4:28:59 AM: 2 updates could not be installed automatically. For more details,
4:28:59 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:28:59 AM: *** System restart required ***
4:28:59 AM: Cloning into 'packageRemote'...
4:28:59 AM: node version: v18.3
4:28:59 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:29:00 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:29:00 AM: * Documentation: https://help.ubuntu.com
4:29:00 AM: * Management: https://landscape.canonical.com
4:29:00 AM: * Support: https://ubuntu.com/advantage
4:29:00 AM: System information as of Sat Jan 11 04:28:59 UTC 2025
4:29:00 AM: System load: 0.18
4:29:00 AM: Usage of /: 38.8% of 217.51GB
4:29:00 AM: Memory usage: 19%
4:29:00 AM: Swap usage: 0%
4:29:00 AM: Processes: 626
4:29:00 AM: Users logged in: 1
4:29:00 AM: IPv4 address for bond0: 147.75.199.15
4:29:00 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:29:00 AM: IPv4 address for docker0: 172.17.0.1
4:29:00 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:29:00 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:29:00 AM: 86 updates can be applied immediately.
4:29:00 AM: 2 of these updates are standard security updates.
4:29:00 AM: To see these additional updates run: apt list --upgradable
4:29:00 AM: New release '22.04.5 LTS' available.
4:29:00 AM: Run 'do-release-upgrade' to upgrade to it.
4:29:00 AM: 2 updates could not be installed automatically. For more details,
4:29:00 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:29:00 AM: *** System restart required ***
4:29:00 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:29:00 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:29:00 AM: * Documentation: https://help.ubuntu.com
4:29:00 AM: * Management: https://landscape.canonical.com
4:29:00 AM: * Support: https://ubuntu.com/advantage
4:29:00 AM: System information as of Sat Jan 11 04:28:59 UTC 2025
4:29:00 AM: System load: 0.18
4:29:00 AM: Usage of /: 38.8% of 217.51GB
4:29:00 AM: Memory usage: 19%
4:29:00 AM: Swap usage: 0%
4:29:00 AM: Processes: 626
4:29:00 AM: Users logged in: 1
4:29:00 AM: IPv4 address for bond0: 147.75.199.15
4:29:00 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:29:00 AM: IPv4 address for docker0: 172.17.0.1
4:29:00 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:29:00 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:29:00 AM: 86 updates can be applied immediately.
4:29:00 AM: 2 of these updates are standard security updates.
4:29:00 AM: To see these additional updates run: apt list --upgradable
4:29:00 AM: New release '22.04.5 LTS' available.
4:29:00 AM: Run 'do-release-upgrade' to upgrade to it.
4:29:00 AM: 2 updates could not be installed automatically. For more details,
4:29:00 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:29:00 AM: *** System restart required ***
4:29:00 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:29:01 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:29:01 AM: * Documentation: https://help.ubuntu.com
4:29:01 AM: * Management: https://landscape.canonical.com
4:29:01 AM: * Support: https://ubuntu.com/advantage
4:29:01 AM: System information as of Sat Jan 11 04:29:00 UTC 2025
4:29:01 AM: System load: 0.24
4:29:01 AM: Usage of /: 38.8% of 217.51GB
4:29:01 AM: Memory usage: 19%
4:29:01 AM: Swap usage: 0%
4:29:01 AM: Processes: 625
4:29:01 AM: Users logged in: 1
4:29:01 AM: IPv4 address for bond0: 147.75.199.15
4:29:01 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:29:01 AM: IPv4 address for docker0: 172.17.0.1
4:29:01 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:29:01 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:29:01 AM: 86 updates can be applied immediately.
4:29:01 AM: 2 of these updates are standard security updates.
4:29:01 AM: To see these additional updates run: apt list --upgradable
4:29:01 AM: New release '22.04.5 LTS' available.
4:29:01 AM: Run 'do-release-upgrade' to upgrade to it.
4:29:01 AM: 2 updates could not be installed automatically. For more details,
4:29:01 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:29:01 AM: *** System restart required ***
4:29:01 AM: focal: Pulling from netlify/build
4:29:01 AM: Digest: sha256:12e8d8fa46c18c8b45046890c49b7b171056b9920de51c0458a0662e57ebf2b1
4:29:01 AM: Status: Image is up to date for netlify/build:focal
4:29:01 AM: docker.io/netlify/build:focal
4:29:01 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:29:01 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:29:01 AM: * Documentation: https://help.ubuntu.com
4:29:01 AM: * Management: https://landscape.canonical.com
4:29:01 AM: * Support: https://ubuntu.com/advantage
4:29:01 AM: System information as of Sat Jan 11 04:29:00 UTC 2025
4:29:01 AM: System load: 0.24
4:29:01 AM: Usage of /: 38.8% of 217.51GB
4:29:01 AM: Memory usage: 19%
4:29:01 AM: Swap usage: 0%
4:29:01 AM: Processes: 625
4:29:01 AM: Users logged in: 1
4:29:01 AM: IPv4 address for bond0: 147.75.199.15
4:29:01 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:29:01 AM: IPv4 address for docker0: 172.17.0.1
4:29:01 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:29:01 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:29:01 AM: 86 updates can be applied immediately.
4:29:01 AM: 2 of these updates are standard security updates.
4:29:01 AM: To see these additional updates run: apt list --upgradable
4:29:01 AM: New release '22.04.5 LTS' available.
4:29:01 AM: Run 'do-release-upgrade' to upgrade to it.
4:29:01 AM: 2 updates could not be installed automatically. For more details,
4:29:01 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:29:01 AM: *** System restart required ***
4:29:01 AM: focal: Pulling from netlify/build
4:29:01 AM: Digest: sha256:12e8d8fa46c18c8b45046890c49b7b171056b9920de51c0458a0662e57ebf2b1
4:29:01 AM: Status: Image is up to date for netlify/build:focal
4:29:01 AM: docker.io/netlify/build:focal
4:29:04 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:29:04 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:29:04 AM: * Documentation: https://help.ubuntu.com
4:29:04 AM: * Management: https://landscape.canonical.com
4:29:04 AM: * Support: https://ubuntu.com/advantage
4:29:04 AM: System information as of Sat Jan 11 04:29:04 UTC 2025
4:29:04 AM: System load: 0.24
4:29:04 AM: Usage of /: 38.9% of 217.51GB
4:29:04 AM: Memory usage: 19%
4:29:04 AM: Swap usage: 0%
4:29:04 AM: Processes: 632
4:29:04 AM: Users logged in: 1
4:29:04 AM: IPv4 address for bond0: 147.75.199.15
4:29:04 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:29:04 AM: IPv4 address for docker0: 172.17.0.1
4:29:04 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:29:04 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:29:04 AM: 86 updates can be applied immediately.
4:29:04 AM: 2 of these updates are standard security updates.
4:29:04 AM: To see these additional updates run: apt list --upgradable
4:29:04 AM: New release '22.04.5 LTS' available.
4:29:04 AM: Run 'do-release-upgrade' to upgrade to it.
4:29:04 AM: 2 updates could not be installed automatically. For more details,
4:29:04 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:29:04 AM: *** System restart required ***
4:29:04 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:29:04 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:29:04 AM: * Documentation: https://help.ubuntu.com
4:29:04 AM: * Management: https://landscape.canonical.com
4:29:04 AM: * Support: https://ubuntu.com/advantage
4:29:04 AM: System information as of Sat Jan 11 04:29:04 UTC 2025
4:29:04 AM: System load: 0.24
4:29:04 AM: Usage of /: 38.9% of 217.51GB
4:29:04 AM: Memory usage: 19%
4:29:04 AM: Swap usage: 0%
4:29:04 AM: Processes: 632
4:29:04 AM: Users logged in: 1
4:29:04 AM: IPv4 address for bond0: 147.75.199.15
4:29:04 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:29:04 AM: IPv4 address for docker0: 172.17.0.1
4:29:04 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:29:04 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:29:04 AM: 86 updates can be applied immediately.
4:29:04 AM: 2 of these updates are standard security updates.
4:29:04 AM: To see these additional updates run: apt list --upgradable
4:29:04 AM: New release '22.04.5 LTS' available.
4:29:04 AM: Run 'do-release-upgrade' to upgrade to it.
4:29:04 AM: 2 updates could not be installed automatically. For more details,
4:29:04 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:29:04 AM: *** System restart required ***
4:29:05 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:29:05 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:29:05 AM: * Documentation: https://help.ubuntu.com
4:29:05 AM: * Management: https://landscape.canonical.com
4:29:05 AM: * Support: https://ubuntu.com/advantage
4:29:05 AM: System information as of Sat Jan 11 04:29:05 UTC 2025
4:29:05 AM: System load: 0.24
4:29:05 AM: Usage of /: 38.9% of 217.51GB
4:29:05 AM: Memory usage: 19%
4:29:05 AM: Swap usage: 0%
4:29:05 AM: Processes: 636
4:29:05 AM: Users logged in: 1
4:29:05 AM: IPv4 address for bond0: 147.75.199.15
4:29:05 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:29:05 AM: IPv4 address for docker0: 172.17.0.1
4:29:05 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:29:05 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:29:05 AM: 86 updates can be applied immediately.
4:29:05 AM: 2 of these updates are standard security updates.
4:29:05 AM: To see these additional updates run: apt list --upgradable
4:29:05 AM: New release '22.04.5 LTS' available.
4:29:05 AM: Run 'do-release-upgrade' to upgrade to it.
4:29:05 AM: 2 updates could not be installed automatically. For more details,
4:29:05 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:29:05 AM: *** System restart required ***
4:29:06 AM: /opt/buildhome/.nvm/nvm.sh
4:29:06 AM: .:
4:29:06 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:29:06 AM: bin landscapes_dev package.json update_server
4:29:06 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:29:06 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:29:06 AM: files LICENSE server.js
4:29:06 AM: _headers netlify specs
4:29:06 AM: INSTALL.md netlify.md src
4:29:06 AM: v18.3
4:29:06 AM: Downloading and installing node v18.3.0...
4:29:07 AM: Computing checksum with sha256sum
4:29:07 AM: Checksums matched!
4:29:10 AM: Now using node v18.3.0 (npm v8.11.0)
4:29:10 AM: Now using node v18.3.0 (npm v8.11.0)
4:29:11 AM: npm
4:29:11 AM: WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:29:11 AM:
4:29:11 AM: added 3 packages, and audited 4 packages in 487ms
4:29:11 AM: found 0 vulnerabilities
4:29:12 AM: npm
4:29:12 AM: WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:29:12 AM:
4:29:14 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:29:14 AM: 27 packages are looking for funding
4:29:14 AM: run `npm fund` for details
4:29:14 AM: found 0 vulnerabilities
4:29:15 AM: added 1 package in 998ms
4:29:16 AM: YN0000: ┌ Resolution step
4:29:16 AM: YN0000: └ Completed
4:29:16 AM: YN0000: ┌ Fetch step
4:29:22 AM: YN0013: │ 2 packages were already cached, 808 had to be fetched
4:29:22 AM: YN0000: └ Completed in 5s 830ms
4:29:22 AM: YN0000: ┌ Link step
4:29:23 AM: YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:29:23 AM: YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:29:23 AM: YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:29:23 AM: YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:29:24 AM: YN0000: │ puppeteer@npm:14.2.1 STDERR
4:29:25 AM: YN0000: │ puppeteer@npm:13.2.0 STDERR
4:29:28 AM: YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:29:29 AM: YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:29:29 AM: YN0000: └ Completed in 6s 513ms
4:29:29 AM: YN0000: Done with warnings in 12s 750ms
4:29:31 AM: Processing the tree
4:29:33 AM: Running with a level=easy. Settings:
4:29:33 AM: Use cached crunchbase data: true
4:29:33 AM: Use cached images data: true
4:29:33 AM: Use cached twitter data: true
4:29:33 AM: Use cached github basic stats: true
4:29:33 AM: Use cached github start dates: true
4:29:33 AM: Use cached best practices: true
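
The "level=easy" settings listed above are cache toggles: each data source is read from cache instead of being re-fetched. A hedged sketch of how such flags could be consumed; the property names are illustrative, not the actual landscapeapp variables:

    // Illustrative only: cache toggles mirroring the settings printed above.
    const settings = {
      useCachedCrunchbaseData: true,
      useCachedImagesData: true,
      useCachedTwitterData: true,
      useCachedGithubBasicStats: true,
      useCachedGithubStartDates: true,
      useCachedBestPractices: true,
    };

    // A fetch step would consult its flag before doing any network work.
    function shouldRefetch(flagName) {
      return settings[flagName] !== true;
    }

    console.log(shouldRefetch('useCachedCrunchbaseData')); // false: cached entries are reused
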
4:29:34 AM: Fetching crunchbase entries
4:29:34 AM: ................................................................................
4:29:34 AM: ................................................................................
4:29:34 AM: ................................................................................
4:29:34 AM: ................................................................................
4:29:34 AM: ................................................................................
4:29:34 AM: ........................................................*
4:29:34 AM: Fetching github entries
4:29:41 AM: ................................................................................
4:29:41 AM: ................................................................................
4:29:41 AM: ..................................*********************.........................
4:29:41 AM: ................................................................................
4:29:41 AM: ....................................................................*********
4:29:41 AM: Fetching start date entries
4:29:44 AM: ................................................................................
4:29:44 AM: ................................................................................
4:29:44 AM: ............................................***********.........................
4:29:44 AM: ................................................................................
4:29:44 AM: ..........................................................*******************
4:29:44 AM: Fetching images
4:29:45 AM: got image entries
4:29:45 AM: Hash for Prefect is prefect-2
4:29:50 AM: ................................................................................
4:29:50 AM: ............**....*.............................................................
4:29:50 AM: ................................................................................
4:29:50 AM: ................................................................................
4:29:50 AM: ................................................................................
4:29:50 AM: ................................................................................
4:29:50 AM: ..
4:29:50 AM: Fetching last tweet dates
4:29:50 AM: Fetching best practices
4:29:50 AM: ................................................................................
4:29:50 AM: ................................................................................
4:29:50 AM: ................................................................................
4:29:50 AM: ................................................................................
4:29:50 AM: ................................................
4:29:50 AM: Fetching CLOMonitor data
4:29:50 AM: Processing the tree
4:29:50 AM: saving!
4:29:53 AM: Hash for Prefect is prefect-2
4:29:53 AM: Warning: Open Platform for Enterprise AI (OPEA) has a twitter https://twitter.com/opeadev which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Dragonfly has a twitter https://twitter.com/dragonfly_oss which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Vald has a twitter https://twitter.com/vdaas_vald which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: FastTrackML has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Mathesar has a twitter https://twitter.com/centerofci which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: RWKV has a twitter https://twitter.com/RWKV_AI which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Langchain has a twitter https://twitter.com/LangChainAI which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Haystack has a twitter https://twitter.com/deepset_ai which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Armada has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: envd has a twitter https://twitter.com/TensorChord which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: NVIDIA Corporation has a twitter https://twitter.com/nvidia which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Advanced Micro Devices (AMD) has a twitter https://twitter.com/AMD which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Alibaba Cloud (Singapore) Private LTD has a twitter https://twitter.com/AlibabaB2B which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Douyin Vision Co., Ltd. has a twitter https://twitter.com/BytedanceTalk which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Fujitsu Limited has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: GitCode has a twitter https://twitter.com/turingbook which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Neo4j, Inc. has a twitter https://twitter.com/neo4j which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: New Relic, Inc. has a twitter https://twitter.com/newrelic which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Posit has a twitter https://twitter.com/posit_pbc which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: AI for People has a twitter https://twitter.com/AIforPeople which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Ambianic.ai has a twitter https://twitter.com/ambianicai which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Columbia University has a twitter https://twitter.com/Columbia which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Common Crawl Foundation has a twitter https://twitter.com/commoncrawl which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Ersilia Open Source Initiative has a twitter https://twitter.com/ersiliaio which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Ewha Institute for Biomedical Law & Ethics has a twitter https://twitter.com/cybercampus which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Montreal AI Ethics Institute has a twitter https://twitter.com/mtlaiethics which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: NIPA has a twitter https://twitter.com/NIPAkr which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: One Fact Foundation has a twitter https://twitter.com/onefact_org which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Pranveer Singh Institute Of Technology has a twitter https://twitter.com/PSITKanpur2004 which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Rensselaer Polytechnic Institute has a twitter https://twitter.com/rpi which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Transport for NSW has a twitter https://twitter.com/TransportforNSW which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: University of Michigan has a twitter https://twitter.com/UMich which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Wikimedia Deutschland e. V. has a twitter https://twitter.com/WikimediaDE which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: World Ethical Data Foundation has a twitter https://twitter.com/WEDF_foundation which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: BasicAI(hosting) has a twitter https://twitter.com/BasicAIteam which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Fujitsu Limited(hosting) has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:29:53 AM: Warning: Owkin(hosting) has a twitter https://twitter.com/owkinscience which is invalid or we just can not fetch its tweets
4:29:54 AM: Fetching members from LF AI & Data Member Company category
4:29:54 AM: Processing the tree
4:29:54 AM: Assigning Premier membership on DataHub (LinkedIn) because its parent Microsoft has Premier membership
4:29:54 AM: Assigning Premier membership on Brooklin (LinkedIn) because its parent Microsoft has Premier membership
4:29:54 AM: Assigning Premier membership on Azkaban (LinkedIn) because its parent Microsoft has Premier membership
4:29:54 AM: Assigning Premier membership on LinkedIn (hosting) (LinkedIn) because its parent Microsoft has Premier membership
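
The four "Assigning Premier membership" lines reflect a parent-inheritance rule: an entry without its own membership level picks up the level of its parent organization. A minimal sketch of that rule with invented field names; the real tree-processing code in landscapeapp may differ:

    // Minimal sketch of the membership-inheritance rule suggested by the log above;
    // field names are assumptions, not the actual landscapeapp schema.
    function assignInheritedMembership(item, parentsByName) {
      const parent = parentsByName[item.parentName];
      if (!item.membership && parent && parent.membership === 'Premier') {
        item.membership = 'Premier';
        console.log(
          `Assigning Premier membership on ${item.name} because its parent ` +
          `${parent.name} has Premier membership`
        );
      }
      return item;
    }

    // Example mirroring one of the log lines:
    const parents = { Microsoft: { name: 'Microsoft', membership: 'Premier' } };
    assignInheritedMembership(
      { name: 'DataHub (LinkedIn)', parentName: 'Microsoft' },
      parents
    );
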
4:29:54 AM: Hash for Chaitanya Bharathi Institute of Technology is chaitanya-bharathi-institute-of-technology-2
4:29:54 AM: Hash for Fast.ai is fast-ai-2
4:29:54 AM: Hash for Great Expectations is great-expectations-2
4:29:54 AM: Hash for ML Perf is ml-perf-2
4:29:54 AM: Hash for PipelineAI is pipeline-ai-2
4:29:54 AM: Hash for Prefect is prefect-2
4:29:54 AM: Hash for Redash is redash-2
4:29:54 AM: Hash for Wikimedia Deutschland e.V. is wikimedia-deutschland-e-v-2
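
The "Hash for … is …-2" lines appear to show id generation: the name is slugified (lowercased, punctuation collapsed to dashes) and a numeric suffix is appended when the slug is already taken. A rough sketch under that assumption, not the landscapeapp source:

    // Rough sketch, assuming the "-2" suffix disambiguates repeated slugs.
    const seen = new Map();

    function slugFor(name) {
      const base = name
        .toLowerCase()
        .replace(/[^a-z0-9]+/g, '-')   // "Fast.ai" -> "fast-ai"
        .replace(/^-+|-+$/g, '');      // trim stray dashes
      const count = (seen.get(base) || 0) + 1;
      seen.set(base, count);
      return count === 1 ? base : `${base}-${count}`;
    }

    console.log(slugFor('Prefect'));   // prefect
    console.log(slugFor('Prefect'));   // prefect-2, as in the log
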
4:29:57 AM: {
4:29:57 AM: name: 'Accord.NET',
4:29:57 AM: homepage_url: 'http://accord-framework.net/',
4:29:57 AM: logo: 'accord-net.svg',
4:29:57 AM: github_data: {
4:29:57 AM: languages: [
4:29:57 AM: [Object], [Object],
4:29:57 AM: [Object], [Object],
4:29:57 AM: [Object], [Object],
4:29:57 AM: [Object], [Object],
4:29:57 AM: [Object], [Object],
4:29:57 AM: [Object], [Object],
4:29:57 AM: [Object], [Object],
4:29:57 AM: [Object]
4:29:57 AM: ],
4:29:57 AM: contributions: '0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0',
4:29:57 AM: firstWeek: '2022-11-27Z',
4:29:57 AM: stars: 4404,
4:29:57 AM: license: 'GNU Lesser General Public License v2.1',
4:29:57 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:29:57 AM: latest_commit_date: '2020-11-18T19:53:01Z',
4:29:57 AM: latest_commit_link: '/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:29:57 AM: release_date: '2017-10-19T21:00:56Z',
4:29:57 AM: contributors_count: 98,
4:29:57 AM: },
4:29:57 AM: repos: [ { url: 'https://github.com/accord-net/framework', stars: 4404 } ],
4:29:57 AM: github_start_commit_data: {
4:29:57 AM: start_commit_link: '/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:29:57 AM: start_date: '2012-04-08T14:05:58Z'
4:29:57 AM: },
4:29:57 AM: image_data: {
4:29:57 AM: fileName: 'accord-net.svg',
4:29:57 AM: hash: 'tdvFWfo6hfZQqHqjscAtcoZWk6qztZFzBTVqKWUjYig='
4:29:57 AM: },
4:29:57 AM: firstCommitDate: '2012-04-08T14:05:58Z',
4:29:57 AM: latestCommitDate: '2020-11-18T19:53:01Z',
4:29:57 AM: releaseDate: '2017-10-19T21:00:56Z',
4:29:57 AM: commitsThisYear: 0,
4:29:57 AM: contributorsCount: 98,
4:29:57 AM: language: 'C#',
4:29:57 AM: stars: 4404,
4:29:57 AM: license: 'GNU Lesser General Public License v2.1',
4:29:57 AM: headquarters: 'Grenoble, France',
4:29:57 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:29:57 AM: organization: 'Accord.NET Framework',
4:29:57 AM: crunchbaseData: {
4:29:57 AM: name: 'Accord.NET Framework',
4:29:57 AM: description: 'Machine Learning Framework',
4:29:57 AM: homepage: 'http://accord-framework.net/',
4:29:57 AM: city: 'Grenoble',
4:29:57 AM: region: 'Rhone-Alpes',
4:29:57 AM: country: 'France',
4:29:57 AM: twitter: null,
4:29:57 AM: linkedin: null,
4:29:57 AM: acquisitions: [],
4:29:57 AM: parents: [],
4:29:57 AM: stockExchange: null,
4:29:57 AM: company_type: 'Non Profit',
4:29:57 AM: industries: [
4:29:57 AM: 'Analytics',
4:29:57 AM: 'Artificial Intelligence',
4:29:57 AM: 'Hardware',
4:29:57 AM: 'Machine Learning'
4:29:57 AM: ],
4:29:57 AM: numEmployeesMin: null,
4:29:57 AM: numEmployeesMax: null
4:29:57 AM: },
4:29:57 AM: path: 'Machine Learning / Framework',
4:29:57 AM: landscape: 'Machine Learning / Framework',
4:29:57 AM: category: 'Machine Learning',
4:29:57 AM: amount: 'N/A',
4:29:57 AM: oss: true,
4:29:57 AM: href: 'logos/accord-net.svg',
4:29:57 AM: bestPracticeBadgeId: false,
4:29:57 AM: bestPracticePercentage: null,
4:29:57 AM: industries: [
4:29:57 AM: 'Analytics',
4:29:57 AM: 'Artificial Intelligence',
4:29:57 AM: 'Hardware',
4:29:57 AM: 'Machine Learning'
4:29:57 AM: ],
4:29:57 AM: starsPresent: true,
4:29:57 AM: starsAsText: '4,404',
4:29:57 AM: marketCapPresent: false,
4:29:57 AM: marketCapAsText: 'N/A',
4:29:57 AM: id: 'accord-net',
4:29:57 AM: flatName: 'Accord.NET',
4:29:57 AM: member: false,
4:29:57 AM: relation: false,
4:29:57 AM: isSubsidiaryProject: false
4:29:57 AM: } 2020-11-18T19:53:01Z
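
The processed entry dumped above carries both raw values and derived presentation fields (starsPresent, starsAsText, marketCapAsText). A small illustrative sketch of how such display fields could be derived; this is not the actual landscapeapp code:

    // Illustrative derivation of the display fields seen in the entry above.
    function displayFields(entry) {
      return {
        starsPresent: Number.isFinite(entry.stars),
        starsAsText: Number.isFinite(entry.stars)
          ? entry.stars.toLocaleString('en-US')   // 4404 -> "4,404"
          : 'N/A',
        marketCapPresent: entry.amount !== 'N/A',
        marketCapAsText: entry.amount === 'N/A' ? 'N/A' : String(entry.amount),
      };
    }

    console.log(displayFields({ stars: 4404, amount: 'N/A' }));
    // -> { starsPresent: true, starsAsText: '4,404', marketCapPresent: false, marketCapAsText: 'N/A' }
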
4:29:57 AM: [
4:29:57 AM: 'Community Data License Agreement (CDLA)',
4:29:57 AM: 'PlaNet',
4:29:57 AM: 'Generic Neural Elastic Search (GNES)',
4:29:57 AM: 'PredictionIO',
4:29:57 AM: 'ELI5',
4:29:57 AM: 'BERT',
4:29:57 AM: 'Nauta',
4:29:57 AM: 'DAWNBench',
4:29:57 AM: 'AresDB',
4:29:57 AM: 'dotmesh',
4:29:57 AM: 'Audit AI',
4:29:57 AM: 'euler',
4:29:57 AM: 'Clipper',
4:29:57 AM: 'Accord.NET',
4:29:57 AM: 'Shogun',
4:29:57 AM: 'DELTA',
4:29:57 AM: 'BeakerX',
4:29:57 AM: 'PixieDust',
4:29:57 AM: 'TreeInterpreter',
4:29:57 AM: 'Cyclone',
4:29:57 AM: 'Lucid',
4:29:57 AM: 'XLM',
4:29:57 AM: 'Chainer RL',
4:29:57 AM: 'ForestFlow',
4:29:57 AM: 'uReplicator',
4:29:57 AM: 'Elastic Deep Learning (EDL)',
4:29:57 AM: 'Kashgari',
4:29:57 AM: 'DataPractices',
4:29:57 AM: 'X-DeepLearning',
4:29:57 AM: 'LIME',
4:29:57 AM: 'Model Asset eXchange (MAX)',
4:29:57 AM: 'TransmogrifAI',
4:29:57 AM: 'OpenBytes',
4:29:57 AM: 'DeepLIFT',
4:29:57 AM: 'Onepanel',
4:29:57 AM: 'DeepSpeech',
4:29:57 AM: 'Lucene',
4:29:57 AM: 'Turi Create',
4:29:57 AM: 'Visual Object Tagging Tool (VoTT)',
4:29:57 AM: 'Acumos',
4:29:57 AM: 'Skater',
4:29:57 AM: 'Catalyst',
4:29:57 AM: 'SKIP Language',
4:29:57 AM: 'SQLFlow',
4:29:57 AM: 'Advertorch',
4:29:57 AM: 'xLearn',
4:29:57 AM: 'Neuropod',
4:29:57 AM: 'AdvBox',
4:29:57 AM: 'RCloud',
4:29:57 AM: 'Neo-AI',
4:29:57 AM: 'Embedded Learning Library',
4:29:57 AM: 'Stable Baselines',
4:29:57 AM: 'talos',
4:29:57 AM: 'LabelImg',
4:29:57 AM: 'MMdnn',
4:29:57 AM: 'CNTK',
4:29:57 AM: 'Machine Learning eXchange',
4:29:57 AM: 'Singularity',
4:29:57 AM: 'Chainer',
4:29:57 AM: 'PyText',
4:29:57 AM: 'Pipeline.ai',
4:29:57 AM: 'Apache Bahir',
4:29:57 AM: 'NLP Architect',
4:29:57 AM: 'AllenNLP',
4:29:57 AM: 'Angel-ML',
4:29:57 AM: 'SEED RL',
4:29:57 AM: 'Coach',
4:29:57 AM: 'Gluon-NLP',
4:29:57 AM: 'DeepMind Lab',
4:29:57 AM: 'SEAL',
4:29:57 AM: 'MXNet',
4:29:57 AM: 'OpenAI Gym',
4:29:57 AM: 'MindMeld',
4:29:57 AM: 'CleverHans',
4:29:57 AM: 'Petastorm',
4:29:57 AM: 'Hawq',
4:29:57 AM: 'TF Encrypted',
4:29:57 AM: 'faust',
4:29:57 AM: 'Cortex',
4:29:57 AM: 'OpenDataology',
4:29:57 AM: 'YouTokenToMe',
4:29:57 AM: 'ALBERT',
4:29:57 AM: 'Adlik',
4:29:57 AM: '1chipML',
4:29:57 AM: 'Neural Network Distiller',
4:29:57 AM: 'Labelbox',
4:29:57 AM: 'Facets',
4:29:57 AM: 'OpenNN',
4:29:57 AM: 'Pilosa',
4:29:57 AM: 'Orchest',
4:29:57 AM: 'Model Server for Apache MXNet',
4:29:57 AM: 'LASER',
4:29:57 AM: 'Dopamine',
4:29:57 AM: 'MindSpore',
4:29:57 AM: 'HE Lib',
4:29:57 AM: 'd6tflow',
4:29:57 AM: 'Sonnet',
4:29:57 AM: 'Plaid ML',
4:29:57 AM: 'Nyoka',
4:29:57 AM: 'doccano',
4:29:57 AM: ... 253 more items
4:29:57 AM: ]
4:30:01 AM: ncc: Version 0.34.0
4:30:01 AM: ncc: Compiling file index.js into CJS
4:30:02 AM: ncc: Version 0.34.0
4:30:02 AM: ncc: Compiling file index.js into CJS
4:30:03 AM: ncc: Version 0.34.0
4:30:03 AM: ncc: Compiling file index.js into CJS
4:30:07 AM: Development server running at http://127.0.0.1:4000/
4:30:17 AM: Running integration-test,check-landscape,render-landscape,funding in parallel
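
The line above shows four named tasks being run concurrently against the dev server started on port 4000. A sketch of that pattern with Promise.all and child_process; the commands are placeholders, not the real task definitions:

    // Sketch of running several named tasks concurrently, in the spirit of the
    // "Running ... in parallel" line above. Commands are placeholders.
    const { exec } = require('child_process');
    const { promisify } = require('util');
    const run = promisify(exec);

    const tasks = {
      'integration-test': 'yarn jest specs',
      'check-landscape': 'node tools/checkLandscape.js',
      'render-landscape': 'node tools/renderLandscape.js',
      'funding': 'node tools/funding.js',
    };

    Promise.all(
      Object.entries(tasks).map(async ([name, cmd]) => {
        const { stdout } = await run(cmd);
        console.log(`Task: ${name}`, stdout.trim());
      })
    ).then(() => console.log('all tasks finished'));
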
4:30:19 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:30:21 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:30:22 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:30:22 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:30:24 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:30:24 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:30:26 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:30:26 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:30:27 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:30:27 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:30:28 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:30:28 AM: api request starting... /api/ids?category=&project=&license=&organization=accord-net-framework&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:30:29 AM: api request starting... /api/ids?category=&project=&license=&organization=microsoft&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:30:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:31:00 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
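
Each "api request starting..." line is a query against the dev server's /api/ids endpoint, with every filter parameter present even when empty. A sketch of replaying one of the queries above (the organization=microsoft card view) against the local server; it assumes Node 18+ for the global fetch and a JSON response:

    // Sketch: replaying one of the /api/ids queries above against the dev server
    // on port 4000. Assumes Node 18+ (global fetch) and a JSON response body.
    const params = new URLSearchParams({
      category: '', project: '', license: '', organization: 'microsoft',
      headquarters: '', 'company-type': '', industries: '', sort: 'name',
      grouping: 'project', bestpractices: '', enduser: '', parent: '',
      language: '', specification: '', format: 'card',
    });

    fetch(`http://127.0.0.1:4000/api/ids?${params}`)
      .then((res) => res.json())
      .then((ids) => console.log(ids));
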
4:31:20 AM: Task: integration-test PASS specs/main.spec.js (9.621s)
4:31:20 AM: Main test
4:31:20 AM: I visit a main page and have all required elements
4:31:20 AM: ✓ I can open a page (1508ms)
4:31:20 AM: ✓ A proper header is present (5ms)
4:31:20 AM: ✓ Group headers are ok (3ms)
4:31:20 AM: ✓ I see a You are viewing text (2ms)
4:31:20 AM: ✓ A proper card is present (3ms)
4:31:20 AM: ✓ If I click on a card, I see a modal dialog (327ms)
4:31:20 AM: ✓ Closing a browser (30ms)
4:31:20 AM: Landscape Test
4:31:20 AM: ✓ I visit LF AI & Data Members landscape page and have all required elements, elements are clickable (991ms)
4:31:20 AM: ✓ Closing a browser (24ms)
4:31:20 AM: ✓ I visit Companies Hosting Projects landscape page and have all required elements, elements are clickable (697ms)
4:31:20 AM: ✓ Closing a browser (19ms)
4:31:20 AM: I visit a main landscape page and have all required elements
4:31:20 AM: ✓ I open a landscape page and wait for it to load (1885ms)
4:31:20 AM: ✓ When I click on an item the modal is open (86ms)
4:31:20 AM: ✓ If I would straight open the url with a selected id, a modal appears (2048ms)
4:31:20 AM: ✓ Closing a browser (32ms)
4:31:20 AM: Filtering by organization
4:31:20 AM: ✓ Checking we see Accord.NET when filtering by organization Accord.NET Framework (607ms)
4:31:20 AM: ✓ Checking we don't see Accord.NET when filtering by organization Microsoft (285ms)
4:31:20 AM: ✓ Closing a browser (22ms)
4:31:20 AM: PASS specs/tools/actualTwitter.spec.js
4:31:20 AM: Twitter URL
4:31:20 AM: when crunchbase data not set
4:31:20 AM: ✓ returns URL from node (2ms)
4:31:20 AM: when node does not have twitter URL
4:31:20 AM: ✓ returns URL from node
4:31:20 AM: when node has twitter URL set to null
4:31:20 AM: ✓ returns undefined (1ms)
4:31:20 AM: when both node and crunchbase have twitter URL
4:31:20 AM: ✓ returns URL from node
4:31:20 AM: when twitter URL is not set anywhere
4:31:20 AM: ✓ returns undefined (1ms)
4:31:20 AM: cleaning up twitter URL
4:31:20 AM: ✓ replaces http with https
4:31:20 AM: ✓ removes www
4:31:20 AM: ✓ query string (1ms)
4:31:20 AM: Test Suites: 2 passed, 2 total
4:31:20 AM: Tests: 26 passed, 26 total
4:31:20 AM: Snapshots: 0 total
4:31:20 AM: Time: 9.855s
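
The actualTwitter spec above exercises a small URL-selection and cleanup helper: prefer the twitter URL set on the node, treat an explicit null as "no twitter", otherwise fall back to crunchbase data, then normalize http to https, drop www, and strip query strings. A sketch consistent with those test names, not the actual spec's implementation:

    // Sketch of a twitter-URL helper consistent with the spec names above;
    // the real implementation lives in the landscapeapp repo and may differ.
    function actualTwitter(node = {}, crunchbase = {}) {
      if (node.twitter === null) return undefined;      // explicit null opts out
      const url = node.twitter || crunchbase.twitter;   // prefer the node's URL
      if (!url) return undefined;
      return url
        .replace(/^http:\/\//, 'https://')              // http -> https
        .replace(/^https:\/\/www\./, 'https://')        // remove www
        .replace(/\?.*$/, '');                          // strip query string
    }

    console.log(actualTwitter({ twitter: 'http://www.twitter.com/nvidia?lang=en' }));
    // -> https://twitter.com/nvidia
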
4:31:20 AM: Task: check-landscape
4:31:20 AM: Task: render-landscape visiting http://localhost:4000/fullscreen?version=2025-01-11T04:30:18Z 8df7a91&scale=false&pdf
4:31:20 AM: visiting http://localhost:4000/fullscreen?version=2025-01-11T04:30:18Z 8df7a91&scale=false&pdf
4:31:20 AM: Task: funding From https://github.com/lfai/lfai-landscape
4:31:20 AM: * [new branch] NSouthernLF-patch-1 -> github/NSouthernLF-patch-1
4:31:20 AM: * [new branch] NSouthernLF-patch-2 -> github/NSouthernLF-patch-2
4:31:20 AM: * [new branch] NSouthernLF-patch-3 -> github/NSouthernLF-patch-3
4:31:20 AM: * [new branch] NSouthernLF-patch-3-1 -> github/NSouthernLF-patch-3-1
4:31:20 AM: * [new branch] create-pull-request/patch-1693628094 -> github/create-pull-request/patch-1693628094
4:31:20 AM: * [new branch] create-pull-request/patch-1693714471 -> github/create-pull-request/patch-1693714471
4:31:20 AM: * [new branch] create-pull-request/patch-1703668428 -> github/create-pull-request/patch-1703668428
4:31:20 AM: * [new branch] main -> github/main
4:31:20 AM: * [new branch] revert-303-main -> github/revert-303-main
4:31:20 AM: * [new branch] web-landscape-2021-10-01T22-18 -> github/web-landscape-2021-10-01T22-18
4:31:20 AM: * [new branch] web-landscape-2021-10-12T16-06 -> github/web-landscape-2021-10-12T16-06
4:31:20 AM: * [new branch] web-landscape-2021-10-18T19-46 -> github/web-landscape-2021-10-18T19-46
4:31:20 AM: * [new branch] web-landscape-2021-10-20T18-58 -> github/web-landscape-2021-10-20T18-58
4:31:23 AM: Output from remote build, exit code: 0
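
The "Output from remote build, exit code: 0" line closes the SSH round-trip to the remote build host 147.75.199.15: commands run over ssh with no pseudo-terminal (hence the repeated warnings earlier), their output is captured, and it is replayed below under a single timestamp. A hedged sketch of that pattern; the host comes from the log, while the helper and the 'bash build.sh' command are assumptions:

    // Hedged sketch of the ssh round-trip implied by the log: run a command on the
    // remote build host, capture everything, and report the exit code.
    const { execFile } = require('child_process');

    function runRemote(command) {
      return new Promise((resolve) => {
        execFile('ssh', ['147.75.199.15', command], (error, stdout, stderr) => {
          const exitCode = error ? error.code ?? 1 : 0;
          console.log(`Output from remote build, exit code: ${exitCode}`);
          resolve({ exitCode, stdout, stderr });
        });
      });
    }

    // Hypothetical usage; the actual remote command is not shown in the log.
    runRemote('bash build.sh').then(({ stdout }) => console.log(stdout));
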
4:31:23 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:31:23 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:31:23 AM: * Documentation: https://help.ubuntu.com
4:31:23 AM: * Management: https://landscape.canonical.com
4:31:23 AM: * Support: https://ubuntu.com/advantage
4:31:23 AM: System information as of Sat Jan 11 04:29:05 UTC 2025
4:31:23 AM: System load: 0.24
4:31:23 AM: Usage of /: 38.9% of 217.51GB
4:31:23 AM: Memory usage: 19%
4:31:23 AM: Swap usage: 0%
4:31:23 AM: Processes: 636
4:31:23 AM: Users logged in: 1
4:31:23 AM: IPv4 address for bond0: 147.75.199.15
4:31:23 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:31:23 AM: IPv4 address for docker0: 172.17.0.1
4:31:23 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:31:23 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:31:23 AM: 86 updates can be applied immediately.
4:31:23 AM: 2 of these updates are standard security updates.
4:31:23 AM: To see these additional updates run: apt list --upgradable
4:31:23 AM: New release '22.04.5 LTS' available.
4:31:23 AM: Run 'do-release-upgrade' to upgrade to it.
4:31:23 AM: 2 updates could not be installed automatically. For more details,
4:31:23 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:31:23 AM: *** System restart required ***
4:31:23 AM: /opt/buildhome/.nvm/nvm.sh
4:31:23 AM: .:
4:31:23 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:31:23 AM: bin landscapes_dev package.json update_server
4:31:23 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:31:23 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:31:23 AM: files LICENSE server.js
4:31:23 AM: _headers netlify specs
4:31:23 AM: INSTALL.md netlify.md src
4:31:23 AM: v18.3
4:31:23 AM: Downloading and installing node v18.3.0...
4:31:23 AM: Computing checksum with sha256sum
4:31:23 AM: Checksums matched!
4:31:23 AM: Now using node v18.3.0 (npm v8.11.0)
4:31:23 AM: Now using node v18.3.0 (npm v8.11.0)
4:31:23 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:31:23 AM:
4:31:23 AM: added 3 packages, and audited 4 packages in 487ms
4:31:23 AM: found 0 vulnerabilities
4:31:23 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:31:23 AM:
4:31:23 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:31:23 AM: 27 packages are looking for funding
4:31:23 AM: run `npm fund` for details
4:31:23 AM: found 0 vulnerabilities
4:31:23 AM: added 1 package in 998ms
4:31:23 AM: YN0000: ┌ Resolution step
4:31:23 AM: YN0000: └ Completed
4:31:23 AM: YN0000: ┌ Fetch step
4:31:23 AM: YN0013: │ 2 packages were already cached, 808 had to be fetched
4:31:23 AM: YN0000: └ Completed in 5s 830ms
4:31:23 AM: YN0000: ┌ Link step
4:31:23 AM: YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:31:23 AM: YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:31:23 AM: YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:31:23 AM: YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:31:23 AM: YN0000: │ puppeteer@npm:14.2.1 STDERR
4:31:23 AM: YN0000: │ puppeteer@npm:13.2.0 STDERR
4:31:23 AM: YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:31:23 AM: YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:31:23 AM: YN0000: └ Completed in 6s 513ms
4:31:23 AM: YN0000: Done with warnings in 12s 750ms
4:31:23 AM: Processing the tree
4:31:23 AM: Running with a level=easy. Settings:
4:31:23 AM: Use cached crunchbase data: true
4:31:23 AM: Use cached images data: true
4:31:23 AM: Use cached twitter data: true
4:31:23 AM: Use cached github basic stats: true
4:31:23 AM: Use cached github start dates: true
4:31:23 AM: Use cached best practices: true
4:31:23 AM: Fetching crunchbase entries
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ........................................................*
4:31:23 AM: Fetching github entries
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ..................................*********************.........................
4:31:23 AM: ................................................................................
4:31:23 AM: ....................................................................*********
4:31:23 AM: Fetching start date entries
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ............................................***********.........................
4:31:23 AM: ................................................................................
4:31:23 AM: ..........................................................*******************
4:31:23 AM: Fetching images
4:31:23 AM: got image entries
4:31:23 AM: Hash for Prefect is prefect-2
4:31:23 AM: ................................................................................
4:31:23 AM: ............**....*.............................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ..
4:31:23 AM: Fetching last tweet dates
4:31:23 AM: Fetching best practices
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................................................
4:31:23 AM: ................................................
4:31:23 AM: Fetching CLOMonitor data
4:31:23 AM: Processing the tree
4:31:23 AM: saving!
4:31:23 AM: Hash for Prefect is prefect-2
4:31:23 AM: Warning: Open Platform for Enterprise AI (OPEA) has a twitter https://twitter.com/opeadev which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Dragonfly has a twitter https://twitter.com/dragonfly_oss which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Vald has a twitter https://twitter.com/vdaas_vald which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: FastTrackML has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Mathesar has a twitter https://twitter.com/centerofci which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: RWKV has a twitter https://twitter.com/RWKV_AI which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Langchain has a twitter https://twitter.com/LangChainAI which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Haystack has a twitter https://twitter.com/deepset_ai which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Armada has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: envd has a twitter https://twitter.com/TensorChord which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: NVIDIA Corporation has a twitter https://twitter.com/nvidia which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Advanced Micro Devices (AMD) has a twitter https://twitter.com/AMD which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Alibaba Cloud (Singapore) Private LTD has a twitter https://twitter.com/AlibabaB2B which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Douyin Vision Co., Ltd. has a twitter https://twitter.com/BytedanceTalk which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Fujitsu Limited has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: GitCode has a twitter https://twitter.com/turingbook which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Neo4j, Inc. has a twitter https://twitter.com/neo4j which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: New Relic, Inc. has a twitter https://twitter.com/newrelic which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Posit has a twitter https://twitter.com/posit_pbc which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: AI for People has a twitter https://twitter.com/AIforPeople which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Ambianic.ai has a twitter https://twitter.com/ambianicai which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Columbia University has a twitter https://twitter.com/Columbia which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Common Crawl Foundation has a twitter https://twitter.com/commoncrawl which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Ersilia Open Source Initiative has a twitter https://twitter.com/ersiliaio which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Ewha Institute for Biomedical Law & Ethics has a twitter https://twitter.com/cybercampus which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Montreal AI Ethics Institute has a twitter https://twitter.com/mtlaiethics which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: NIPA has a twitter https://twitter.com/NIPAkr which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: One Fact Foundation has a twitter https://twitter.com/onefact_org which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Pranveer Singh Institute Of Technology has a twitter https://twitter.com/PSITKanpur2004 which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Rensselaer Polytechnic Institute has a twitter https://twitter.com/rpi which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Transport for NSW has a twitter https://twitter.com/TransportforNSW which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: University of Michigan has a twitter https://twitter.com/UMich which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Wikimedia Deutschland e. V. has a twitter https://twitter.com/WikimediaDE which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: World Ethical Data Foundation has a twitter https://twitter.com/WEDF_foundation which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: BasicAI(hosting) has a twitter https://twitter.com/BasicAIteam which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Fujitsu Limited(hosting) has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:31:23 AM: Warning: Owkin(hosting) has a twitter https://twitter.com/owkinscience which is invalid or we just can not fetch its tweets
4:31:23 AM: Fetching members from LF AI & Data Member Company category
4:31:23 AM: Processing the tree
4:31:23 AM: Assigning Premier membership on DataHub (LinkedIn) because its parent Microsoft has Premier membership
4:31:23 AM: Assigning Premier membership on Brooklin (LinkedIn) because its parent Microsoft has Premier membership
4:31:23 AM: Assigning Premier membership on Azkaban (LinkedIn) because its parent Microsoft has Premier membership
4:31:23 AM: Assigning Premier membership on LinkedIn (hosting) (LinkedIn) because its parent Microsoft has Premier membership
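The four "Assigning Premier membership" lines above show the tree-processing step propagating a parent company's membership level down to entries that have none of their own. A minimal sketch of that idea, using assumed field names (name, parent, member) rather than the generator's real schema:

    // Sketch only: propagate a parent's membership level to children without one.
    // Field names are assumptions, not the landscape generator's actual internals.
    function propagateMembership(items) {
      const byName = new Map(items.map((item) => [item.name, item]));
      for (const item of items) {
        const parent = item.parent && byName.get(item.parent);
        if (!item.member && parent && parent.member) {
          item.member = parent.member; // e.g. 'Premier'
          console.info(`Assigning ${parent.member} membership on ${item.name} because its parent ${parent.name} has ${parent.member} membership`);
        }
      }
      return items;
    }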
4:31:23 AM: Hash for Chaitanya Bharathi Institute of Technology is chaitanya-bharathi-institute-of-technology-2
4:31:23 AM: Hash for Fast.ai is fast-ai-2
4:31:23 AM: Hash for Great Expectations is great-expectations-2
4:31:23 AM: Hash for ML Perf is ml-perf-2
4:31:23 AM: Hash for PipelineAI is pipeline-ai-2
4:31:23 AM: Hash for Prefect is prefect-2
4:31:23 AM: Hash for Redash is redash-2
4:31:23 AM: Hash for Wikimedia Deutschland e.V. is wikimedia-deutschland-e-v-2
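The "Hash for X is x-2" lines indicate that each entry's id is derived from its name, with a numeric suffix appended when two names collapse to the same slug. A small sketch of that behaviour (the real generator's normalization rules may differ):

    // Sketch: slugify a name and de-duplicate collisions with a numeric suffix.
    const seen = new Map();
    function uniqueSlug(name) {
      const base = name
        .toLowerCase()
        .replace(/[^a-z0-9]+/g, '-')
        .replace(/^-+|-+$/g, '');
      const count = (seen.get(base) || 0) + 1;
      seen.set(base, count);
      return count === 1 ? base : `${base}-${count}`;
    }
    // uniqueSlug('Fast.ai') -> 'fast-ai'
    // a second entry mapping to the same base would get 'fast-ai-2'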
4:31:23 AM: {
4:31:23 AM: name: 'Accord.NET',
4:31:23 AM: homepage_url: 'http://accord-framework.net/',
4:31:23 AM: logo: 'accord-net.svg',
4:31:23 AM: github_data: {
4:31:23 AM: languages: [
4:31:23 AM: [Object], [Object],
4:31:23 AM: [Object], [Object],
4:31:23 AM: [Object], [Object],
4:31:23 AM: [Object], [Object],
4:31:23 AM: [Object], [Object],
4:31:23 AM: [Object], [Object],
4:31:23 AM: [Object], [Object],
4:31:23 AM: [Object]
4:31:23 AM: ],
4:31:23 AM: contributions: '0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0',
4:31:23 AM: firstWeek: '2022-11-27Z',
4:31:23 AM: stars: 4404,
4:31:23 AM: license: 'GNU Lesser General Public License v2.1',
4:31:23 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:31:23 AM: latest_commit_date: '2020-11-18T19:53:01Z',
4:31:23 AM: latest_commit_link: '/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:31:23 AM: release_date: '2017-10-19T21:00:56Z',
4:31:23 AM: contributors_count: 98,
4:31:23 AM: },
4:31:23 AM: repos: [ { url: 'https://github.com/accord-net/framework', stars: 4404 } ],
4:31:23 AM: github_start_commit_data: {
4:31:23 AM: start_commit_link: '/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:31:23 AM: start_date: '2012-04-08T14:05:58Z'
4:31:23 AM: },
4:31:23 AM: image_data: {
4:31:23 AM: fileName: 'accord-net.svg',
4:31:23 AM: hash: 'tdvFWfo6hfZQqHqjscAtcoZWk6qztZFzBTVqKWUjYig='
4:31:23 AM: },
4:31:23 AM: firstCommitDate: '2012-04-08T14:05:58Z',
4:31:23 AM: latestCommitDate: '2020-11-18T19:53:01Z',
4:31:23 AM: releaseDate: '2017-10-19T21:00:56Z',
4:31:23 AM: commitsThisYear: 0,
4:31:23 AM: contributorsCount: 98,
4:31:23 AM: language: 'C#',
4:31:23 AM: stars: 4404,
4:31:23 AM: license: 'GNU Lesser General Public License v2.1',
4:31:23 AM: headquarters: 'Grenoble, France',
4:31:23 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:31:23 AM: organization: 'Accord.NET Framework',
4:31:23 AM: crunchbaseData: {
4:31:23 AM: name: 'Accord.NET Framework',
4:31:23 AM: description: 'Machine Learning Framework',
4:31:23 AM: homepage: 'http://accord-framework.net/',
4:31:23 AM: city: 'Grenoble',
4:31:23 AM: region: 'Rhone-Alpes',
4:31:23 AM: country: 'France',
4:31:23 AM: twitter: null,
4:31:23 AM: linkedin: null,
4:31:23 AM: acquisitions: [],
4:31:23 AM: parents: [],
4:31:23 AM: stockExchange: null,
4:31:23 AM: company_type: 'Non Profit',
4:31:23 AM: industries: [
4:31:23 AM: 'Analytics',
4:31:23 AM: 'Artificial Intelligence',
4:31:23 AM: 'Hardware',
4:31:23 AM: 'Machine Learning'
4:31:23 AM: ],
4:31:23 AM: numEmployeesMin: null,
4:31:23 AM: numEmployeesMax: null
4:31:23 AM: },
4:31:23 AM: path: 'Machine Learning / Framework',
4:31:23 AM: landscape: 'Machine Learning / Framework',
4:31:23 AM: category: 'Machine Learning',
4:31:23 AM: amount: 'N/A',
4:31:23 AM: oss: true,
4:31:23 AM: href: 'logos/accord-net.svg',
4:31:23 AM: bestPracticeBadgeId: false,
4:31:23 AM: bestPracticePercentage: null,
4:31:23 AM: industries: [
4:31:23 AM: 'Analytics',
4:31:23 AM: 'Artificial Intelligence',
4:31:23 AM: 'Hardware',
4:31:23 AM: 'Machine Learning'
4:31:23 AM: ],
4:31:23 AM: starsPresent: true,
4:31:23 AM: starsAsText: '4,404',
4:31:23 AM: marketCapPresent: false,
4:31:23 AM: marketCapAsText: 'N/A',
4:31:23 AM: id: 'accord-net',
4:31:23 AM: flatName: 'Accord.NET',
4:31:23 AM: member: false,
4:31:23 AM: relation: false,
4:31:23 AM: isSubsidiaryProject: false
4:31:23 AM: } 2020-11-18T19:53:01Z
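The entry dumped above is the fully processed landscape item for Accord.NET: raw github_data and crunchbaseData are merged with derived presentation fields such as commitsThisYear, starsPresent and starsAsText. Purely as an illustration of that derivation, assuming only the field layout visible in the dump:

    // Sketch of deriving display fields from the raw item shown above.
    // Assumes the shape printed in the log, not the generator's actual code.
    function deriveDisplayFields(item) {
      const gh = item.github_data || {};
      // contributions is a semicolon-separated list of weekly commit counts
      const weekly = (gh.contributions || '').split(';').map(Number);
      const commitsThisYear = weekly.slice(-52).reduce((a, b) => a + (b || 0), 0);
      return {
        ...item,
        commitsThisYear,
        contributorsCount: gh.contributors_count,
        latestCommitDate: gh.latest_commit_date,
        starsPresent: typeof gh.stars === 'number',
        starsAsText: typeof gh.stars === 'number' ? gh.stars.toLocaleString('en-US') : 'N/A',
      };
    }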
4:31:23 AM: [
4:31:23 AM: 'Community Data License Agreement (CDLA)',
4:31:23 AM: 'PlaNet',
4:31:23 AM: 'Generic Neural Elastic Search (GNES)',
4:31:23 AM: 'PredictionIO',
4:31:23 AM: 'ELI5',
4:31:23 AM: 'BERT',
4:31:23 AM: 'Nauta',
4:31:23 AM: 'DAWNBench',
4:31:23 AM: 'AresDB',
4:31:23 AM: 'dotmesh',
4:31:23 AM: 'Audit AI',
4:31:23 AM: 'euler',
4:31:23 AM: 'Clipper',
4:31:23 AM: 'Accord.NET',
4:31:23 AM: 'Shogun',
4:31:23 AM: 'DELTA',
4:31:23 AM: 'BeakerX',
4:31:23 AM: 'PixieDust',
4:31:23 AM: 'TreeInterpreter',
4:31:23 AM: 'Cyclone',
4:31:23 AM: 'Lucid',
4:31:23 AM: 'XLM',
4:31:23 AM: 'Chainer RL',
4:31:23 AM: 'ForestFlow',
4:31:23 AM: 'uReplicator',
4:31:23 AM: 'Elastic Deep Learning (EDL)',
4:31:23 AM: 'Kashgari',
4:31:23 AM: 'DataPractices',
4:31:23 AM: 'X-DeepLearning',
4:31:23 AM: 'LIME',
4:31:23 AM: 'Model Asset eXchange (MAX)',
4:31:23 AM: 'TransmogrifAI',
4:31:23 AM: 'OpenBytes',
4:31:23 AM: 'DeepLIFT',
4:31:23 AM: 'Onepanel',
4:31:23 AM: 'DeepSpeech',
4:31:23 AM: 'Lucene',
4:31:23 AM: 'Turi Create',
4:31:23 AM: 'Visual Object Tagging Tool (VoTT)',
4:31:23 AM: 'Acumos',
4:31:23 AM: 'Skater',
4:31:23 AM: 'Catalyst',
4:31:23 AM: 'SKIP Language',
4:31:23 AM: 'SQLFlow',
4:31:23 AM: 'Advertorch',
4:31:23 AM: 'xLearn',
4:31:23 AM: 'Neuropod',
4:31:23 AM: 'AdvBox',
4:31:23 AM: 'RCloud',
4:31:23 AM: 'Neo-AI',
4:31:23 AM: 'Embedded Learning Library',
4:31:23 AM: 'Stable Baselines',
4:31:23 AM: 'talos',
4:31:23 AM: 'LabelImg',
4:31:23 AM: 'MMdnn',
4:31:23 AM: 'CNTK',
4:31:23 AM: 'Machine Learning eXchange',
4:31:23 AM: 'Singularity',
4:31:23 AM: 'Chainer',
4:31:23 AM: 'PyText',
4:31:23 AM: 'Pipeline.ai',
4:31:23 AM: 'Apache Bahir',
4:31:23 AM: 'NLP Architect',
4:31:23 AM: 'AllenNLP',
4:31:23 AM: 'Angel-ML',
4:31:23 AM: 'SEED RL',
4:31:23 AM: 'Coach',
4:31:23 AM: 'Gluon-NLP',
4:31:23 AM: 'DeepMind Lab',
4:31:23 AM: 'SEAL',
4:31:23 AM: 'MXNet',
4:31:23 AM: 'OpenAI Gym',
4:31:23 AM: 'MindMeld',
4:31:23 AM: 'CleverHans',
4:31:23 AM: 'Petastorm',
4:31:23 AM: 'Hawq',
4:31:23 AM: 'TF Encrypted',
4:31:23 AM: 'faust',
4:31:23 AM: 'Cortex',
4:31:23 AM: 'OpenDataology',
4:31:23 AM: 'YouTokenToMe',
4:31:23 AM: 'ALBERT',
4:31:23 AM: 'Adlik',
4:31:23 AM: '1chipML',
4:31:23 AM: 'Neural Network Distiller',
4:31:23 AM: 'Labelbox',
4:31:23 AM: 'Facets',
4:31:23 AM: 'OpenNN',
4:31:23 AM: 'Pilosa',
4:31:23 AM: 'Orchest',
4:31:23 AM: 'Model Server for Apache MXNet',
4:31:23 AM: 'LASER',
4:31:23 AM: 'Dopamine',
4:31:23 AM: 'MindSpore',
4:31:23 AM: 'HE Lib',
4:31:23 AM: 'd6tflow',
4:31:23 AM: 'Sonnet',
4:31:23 AM: 'Plaid ML',
4:31:23 AM: 'Nyoka',
4:31:23 AM: 'doccano',
4:31:23 AM: ... 253 more items
4:31:23 AM: ]
4:31:23 AM: ncc: Version 0.34.0
4:31:23 AM: ncc: Compiling file index.js into CJS
4:31:23 AM: ncc: Version 0.34.0
4:31:23 AM: ncc: Compiling file index.js into CJS
4:31:23 AM: ncc: Version 0.34.0
4:31:23 AM: ncc: Compiling file index.js into CJS
4:31:23 AM: Development server running at http://127.0.0.1:4000/
4:31:23 AM: Running integration-test,check-landscape,render-landscape,funding in parallel
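The build then runs the four named tasks concurrently; their interleaved output follows. A generic sketch of running such tasks in parallel from Node (the actual orchestration lives in landscapeapp's build scripts and is not shown in this log, so treat the yarn invocation as an assumption):

    // Sketch: run several scripts in parallel and fail the build if any of them fails.
    const { promisify } = require('util');
    const exec = promisify(require('child_process').exec);

    async function runParallel(tasks) {
      await Promise.all(
        tasks.map(async (task) => {
          console.log(`Task: ${task}`);
          await exec(`yarn ${task}`, { maxBuffer: 10 * 1024 * 1024 });
        })
      );
    }

    runParallel(['integration-test', 'check-landscape', 'render-landscape', 'funding'])
      .catch((err) => { console.error(err); process.exit(1); });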
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=accord-net-framework&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=microsoft&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:31:23 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
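Each "api request starting..." line above hits the /api/ids endpoint with the full set of filter parameters (category, project, license, organization, grouping, format, and so on), most of them empty for an unfiltered view. A sketch of assembling such a query string in Node; only the parameter names are taken from the log, the defaults are assumptions:

    // Sketch: build an /api/ids query like the ones logged above.
    function buildIdsQuery(overrides = {}) {
      const params = {
        category: '', project: '', license: '', organization: '',
        headquarters: '', 'company-type': '', industries: '',
        sort: 'name', grouping: 'no', bestpractices: '', enduser: '',
        parent: '', language: '', specification: '', format: 'main',
        ...overrides,
      };
      return '/api/ids?' + new URLSearchParams(params).toString();
    }

    // e.g. the card-mode request filtered by organization:
    buildIdsQuery({ organization: 'accord-net-framework', grouping: 'project', format: 'card' });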
4:31:23 AM: Task: integration-test PASS specs/main.spec.js (9.621s)
4:31:23 AM: Main test
4:31:23 AM: I visit a main page and have all required elements
4:31:23 AM: ✓ I can open a page (1508ms)
4:31:23 AM: ✓ A proper header is present (5ms)
4:31:23 AM: ✓ Group headers are ok (3ms)
4:31:23 AM: ✓ I see a You are viewing text (2ms)
4:31:23 AM: ✓ A proper card is present (3ms)
4:31:23 AM: ✓ If I click on a card, I see a modal dialog (327ms)
4:31:23 AM: ✓ Closing a browser (30ms)
4:31:23 AM: Landscape Test
4:31:23 AM: ✓ I visit LF AI & Data Members landscape page and have all required elements, elements are clickable (991ms)
4:31:23 AM: ✓ Closing a browser (24ms)
4:31:23 AM: ✓ I visit Companies Hosting Projects landscape page and have all required elements, elements are clickable (697ms)
4:31:23 AM: ✓ Closing a browser (19ms)
4:31:23 AM: I visit a main landscape page and have all required elements
4:31:23 AM: ✓ I open a landscape page and wait for it to load (1885ms)
4:31:23 AM: ✓ When I click on an item the modal is open (86ms)
4:31:23 AM: ✓ If I would straight open the url with a selected id, a modal appears (2048ms)
4:31:23 AM: ✓ Closing a browser (32ms)
4:31:23 AM: Filtering by organization
4:31:23 AM: ✓ Checking we see Accord.NET when filtering by organization Accord.NET Framework (607ms)
4:31:23 AM: ✓ Checking we don't see Accord.NET when filtering by organization Microsoft (285ms)
4:31:23 AM: ✓ Closing a browser (22ms)
4:31:23 AM: PASS specs/tools/actualTwitter.spec.js
4:31:23 AM: Twitter URL
4:31:23 AM: when crunchbase data not set
4:31:23 AM: ✓ returns URL from node (2ms)
4:31:23 AM: when node does not have twitter URL
4:31:23 AM: ✓ returns URL from node
4:31:23 AM: when node has twitter URL set to null
4:31:23 AM: ✓ returns undefined (1ms)
4:31:23 AM: when both node and crunchbase have twitter URL
4:31:23 AM: ✓ returns URL from node
4:31:23 AM: when twitter URL is not set anywhere
4:31:23 AM: ✓ returns undefined (1ms)
4:31:23 AM: cleaning up twitter URL
4:31:23 AM: ✓ replaces http with https
4:31:23 AM: ✓ removes www
4:31:23 AM: ✓ query string (1ms)
4:31:23 AM: Test Suites: 2 passed, 2 total
4:31:23 AM: Tests: 26 passed, 26 total
4:31:23 AM: Snapshots: 0 total
4:31:23 AM: Time: 9.855s
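The actualTwitter.spec.js suite above checks how an entry's Twitter URL is chosen (the node's own value wins over crunchbase, an explicit null means none) and how the URL is normalized: http becomes https, www is removed, and the query string is dropped. A hedged reconstruction of the normalization step, based only on the test names and not on the repository's actual implementation:

    // Sketch reconstructed from the test names above, not the real code.
    function cleanTwitterUrl(url) {
      if (!url) return undefined;
      return url
        .replace(/^http:\/\//, 'https://')  // "replaces http with https"
        .replace('://www.', '://')          // "removes www"
        .split('?')[0];                     // drops the query string
    }

    // cleanTwitterUrl('http://www.twitter.com/AMD?lang=en') -> 'https://twitter.com/AMD'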
4:31:23 AM: Task: check-landscape
4:31:23 AM: Task: render-landscape visiting http://localhost:4000/fullscreen?version=2025-01-11T04:30:18Z 8df7a91&scale=false&pdf
4:31:23 AM: visiting http://localhost:4000/fullscreen?version=2025-01-11T04:30:18Z 8df7a91&scale=false&pdf
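The render-landscape task visits the /fullscreen view of the locally running site and renders it; since the earlier yarn step downloaded Chromium for puppeteer, a headless-browser render is a reasonable guess. A minimal puppeteer sketch of visiting such a URL and saving a PDF, with the wait strategy, simplified version parameter and output path all being assumptions:

    // Sketch: render the fullscreen landscape page headlessly with puppeteer.
    const puppeteer = require('puppeteer');

    (async () => {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.goto('http://localhost:4000/fullscreen?version=2025-01-11&scale=false&pdf', {
        waitUntil: 'networkidle0',
      });
      await page.pdf({ path: 'landscape.pdf', printBackground: true });
      await browser.close();
    })();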
4:31:23 AM: Task: funding From https://github.com/lfai/lfai-landscape
4:31:23 AM: * [new branch] NSouthernLF-patch-1 -> github/NSouthernLF-patch-1
4:31:23 AM: * [new branch] NSouthernLF-patch-2 -> github/NSouthernLF-patch-2
4:31:23 AM: * [new branch] NSouthernLF-patch-3 -> github/NSouthernLF-patch-3
4:31:23 AM: * [new branch] NSouthernLF-patch-3-1 -> github/NSouthernLF-patch-3-1
4:31:23 AM: * [new branch] create-pull-request/patch-1693628094 -> github/create-pull-request/patch-1693628094
4:31:23 AM: * [new branch] create-pull-request/patch-1693714471 -> github/create-pull-request/patch-1693714471
4:31:23 AM: * [new branch] create-pull-request/patch-1703668428 -> github/create-pull-request/patch-1703668428
4:31:23 AM: * [new branch] main -> github/main
4:31:23 AM: * [new branch] revert-303-main -> github/revert-303-main
4:31:23 AM: * [new branch] web-landscape-2021-10-01T22-18 -> github/web-landscape-2021-10-01T22-18
4:31:23 AM: * [new branch] web-landscape-2021-10-12T16-06 -> github/web-landscape-2021-10-12T16-06
4:31:23 AM: * [new branch] web-landscape-2021-10-18T19-46 -> github/web-landscape-2021-10-18T19-46
4:31:23 AM: * [new branch] web-landscape-2021-10-20T18-58 -> github/web-landscape-2021-10-20T18-58
4:31:26 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:31:27 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:31:27 AM: * Documentation: https://help.ubuntu.com
4:31:27 AM: * Management: https://landscape.canonical.com
4:31:27 AM: * Support: https://ubuntu.com/advantage
4:31:27 AM: System information as of Sat Jan 11 04:31:27 UTC 2025
4:31:27 AM: System load: 0.87
4:31:27 AM: Usage of /: 39.4% of 217.51GB
4:31:27 AM: Memory usage: 20%
4:31:27 AM: Swap usage: 0%
4:31:27 AM: Processes: 619
4:31:27 AM: Users logged in: 1
4:31:27 AM: IPv4 address for bond0: 147.75.199.15
4:31:27 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:31:27 AM: IPv4 address for docker0: 172.17.0.1
4:31:27 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:31:27 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:31:27 AM: 86 updates can be applied immediately.
4:31:27 AM: 2 of these updates are standard security updates.
4:31:27 AM: To see these additional updates run: apt list --upgradable
4:31:27 AM: New release '22.04.5 LTS' available.
4:31:27 AM: Run 'do-release-upgrade' to upgrade to it.
4:31:27 AM: 2 updates could not be installed automatically. For more details,
4:31:27 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:31:27 AM: *** System restart required ***
4:31:27 AM: Remote build done!
4:31:27 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:31:27 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:31:27 AM: * Documentation: https://help.ubuntu.com
4:31:27 AM: * Management: https://landscape.canonical.com
4:31:27 AM: * Support: https://ubuntu.com/advantage
4:31:27 AM: System information as of Sat Jan 11 04:29:05 UTC 2025
4:31:27 AM: System load: 0.24
4:31:27 AM: Usage of /: 38.9% of 217.51GB
4:31:27 AM: Memory usage: 19%
4:31:27 AM: Swap usage: 0%
4:31:27 AM: Processes: 636
4:31:27 AM: Users logged in: 1
4:31:27 AM: IPv4 address for bond0: 147.75.199.15
4:31:27 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:31:27 AM: IPv4 address for docker0: 172.17.0.1
4:31:27 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:31:27 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:31:27 AM: 86 updates can be applied immediately.
4:31:27 AM: 2 of these updates are standard security updates.
4:31:27 AM: To see these additional updates run: apt list --upgradable
4:31:27 AM: New release '22.04.5 LTS' available.
4:31:27 AM: Run 'do-release-upgrade' to upgrade to it.
4:31:27 AM: 2 updates could not be installed automatically. For more details,
4:31:27 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:31:27 AM: *** System restart required ***
4:31:27 AM: /opt/buildhome/.nvm/nvm.sh
4:31:27 AM: .:
4:31:27 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:31:27 AM: bin landscapes_dev package.json update_server
4:31:27 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:31:27 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:31:27 AM: files LICENSE server.js
4:31:27 AM: _headers netlify specs
4:31:27 AM: INSTALL.md netlify.md src
4:31:27 AM: v18.3
4:31:27 AM: Downloading and installing node v18.3.0...
4:31:27 AM: Computing checksum with sha256sum
4:31:27 AM: Checksums matched!
4:31:27 AM: Now using node v18.3.0 (npm v8.11.0)
4:31:27 AM: Now using node v18.3.0 (npm v8.11.0)
4:31:27 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:31:27 AM:
4:31:27 AM: added 3 packages, and audited 4 packages in 487ms
4:31:27 AM: found 0 vulnerabilities
4:31:27 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:31:27 AM:
4:31:27 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:31:27 AM: 27 packages are looking for funding
4:31:27 AM: run `npm fund` for details
4:31:27 AM: found 0 vulnerabilities
4:31:27 AM: added 1 package in 998ms
4:31:27 AM: YN0000: ┌ Resolution step
4:31:27 AM: YN0000: └ Completed
4:31:27 AM: YN0000: ┌ Fetch step
4:31:27 AM: YN0013: │ 2 packages were already cached, 808 had to be fetched
4:31:27 AM: YN0000: └ Completed in 5s 830ms
4:31:27 AM: YN0000: ┌ Link step
4:31:27 AM: YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:31:27 AM: YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:31:27 AM: YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:31:27 AM: YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:31:27 AM: YN0000: │ puppeteer@npm:14.2.1 STDERR
4:31:27 AM: YN0000: │ puppeteer@npm:13.2.0 STDERR
4:31:27 AM: YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:31:27 AM: YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:31:27 AM: YN0000: └ Completed in 6s 513ms
4:31:27 AM: YN0000: Done with warnings in 12s 750ms
4:31:28 AM: ​
4:31:28 AM: (build.command completed in 2m 30.3s)
4:31:28 AM:
4:31:28 AM: Functions bundling
4:31:28 AM: ────────────────────────────────────────────────────────────────
4:31:28 AM: ​
4:31:28 AM: Packaging Functions from /opt/build/repo/functions directory:
4:31:28 AM: - export.js
4:31:28 AM: - ids.js
4:31:28 AM: - items.js
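Three Netlify Functions (export.js, ids.js, items.js) are bundled from the functions directory; these presumably back the /api/ids and /api/items requests seen earlier. For illustration only, a minimal Netlify function handler has this shape (the real ids.js certainly does more; the response body here is a placeholder):

    // functions/ids.js -- illustrative skeleton of a Netlify function, not the real one.
    exports.handler = async (event) => {
      // event.queryStringParameters carries filters like ?organization=...&format=card
      const { format = 'main' } = event.queryStringParameters || {};
      return {
        statusCode: 200,
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ format, ids: [] }), // placeholder payload
      };
    };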
4:31:28 AM: ​
4:31:33 AM: ​
4:31:33 AM: (Functions bundling completed in 5s)
4:31:33 AM:
4:31:43 AM: (Netlify Build completed in 2m 45s)
4:31:43 AM: Section completed: building
4:31:46 AM: Finished processing build request in 3m24.929s

Deploying

Complete
4:31:33 AM: Deploy site
4:31:33 AM: ────────────────────────────────────────────────────────────────
4:31:33 AM: ​
4:31:33 AM: Starting to deploy site from 'dist'
4:31:33 AM: Calculating files to upload
4:31:36 AM: 10 new file(s) to upload
4:31:36 AM: 3 new function(s) to upload
4:31:42 AM: Section completed: deploying
4:31:42 AM: Site deploy was successfully initiated
4:31:42 AM: ​
4:31:42 AM: (Deploy site completed in 9.4s)

Cleanup

Complete
4:31:43 AM: Netlify Build Complete
4:31:43 AM: ────────────────────────────────────────────────────────────────
4:31:43 AM: ​
4:31:43 AM: Caching artifacts
4:31:43 AM: Started saving build plugins
4:31:43 AM: Finished saving build plugins
4:31:43 AM: Started saving mise cache
4:31:43 AM: Finished saving mise cache
4:31:43 AM: Started saving pip cache
4:31:43 AM: Finished saving pip cache
4:31:43 AM: Started saving emacs cask dependencies
4:31:43 AM: Finished saving emacs cask dependencies
4:31:43 AM: Started saving maven dependencies
4:31:43 AM: Finished saving maven dependencies
4:31:43 AM: Started saving boot dependencies
4:31:43 AM: Finished saving boot dependencies
4:31:43 AM: Started saving rust rustup cache
4:31:43 AM: Finished saving rust rustup cache
4:31:43 AM: Started saving go dependencies
4:31:43 AM: Finished saving go dependencies
4:31:43 AM: Cached Ruby version 2.6.2
4:31:43 AM: Build script success
4:31:45 AM: Uploading Cache of size 218.2MB
4:31:46 AM: Section completed: cleanup

Post-processing

Complete
4:31:43 AM: Starting post processing
4:31:43 AM: Skipping form detection
4:31:43 AM: Post processing - redirect rules
4:31:43 AM: Post processing - header rules
4:31:43 AM: Post processing done
4:31:43 AM: Section completed: postprocessing
4:31:43 AM: Site is live ✨