
Deploy details

Deploy successful for lfailandscape

Update Landscape from LFX 2025-02-07 (#837)

Production: main@d651106

Deploy log

Initializing

Complete
4:29:24 AM: Build ready to start
4:29:37 AM: build-image version: 9c9fb6952e50bb092d4b66daf2368677e5c68e34 (focal)
4:29:37 AM: buildbot version: 9c9fb6952e50bb092d4b66daf2368677e5c68e34
4:29:37 AM: Fetching cached dependencies
4:29:37 AM: Starting to download cache of 216.2MB
4:29:40 AM: Finished downloading cache in 2.329s
4:29:40 AM: Starting to extract cache
4:29:43 AM: Finished extracting cache in 2.998s
4:29:43 AM: Finished fetching cache in 5.452s
4:29:43 AM: Starting to prepare the repo for build
4:29:43 AM: Preparing Git Reference refs/heads/main
4:29:45 AM: Custom build path detected. Proceeding with the specified path: 'netlify'
4:29:45 AM: Custom functions path detected. Proceeding with the specified path: 'functions'
4:29:45 AM: Custom build command detected. Proceeding with the specified command: '(wget --no-check-certificate --no-cache https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js) && node landscape.js'
4:29:45 AM: Custom ignore command detected. Proceeding with the specified command: 'false'
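The four "Custom ... detected" lines above are Netlify echoing settings from the repository's netlify.toml (the config file resolved later in this log). A plausible reconstruction of those settings, using only the values shown here; the key names follow Netlify's standard [build] schema and anything not echoed in the log is an assumption:

```toml
# Sketch of the netlify.toml settings implied by the lines above (not the actual file).
[build]
  base      = "netlify"      # "Custom build path detected"
  functions = "functions"    # "Custom functions path detected"
  ignore    = "false"        # ignore command always fails, so every push triggers a build
  command   = "(wget --no-check-certificate --no-cache https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js) && node landscape.js"
```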
4:29:46 AM: manpath: warning: $PATH not set
4:29:46 AM: Starting to install dependencies
4:29:47 AM: Started restoring cached mise cache
4:29:47 AM: Finished restoring cached mise cache
4:29:48 AM: mise python@3.13.2 install
4:29:48 AM: mise python@3.13.2 download cpython-3.13.2+20250205-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
4:29:48 AM: mise python@3.13.2 extract cpython-3.13.2+20250205-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
4:29:48 AM: mise python@3.13.2 python --version
4:29:48 AM: mise python@3.13.2 Python 3.13.2
4:29:48 AM: mise python@3.13.2 installed
4:29:48 AM: Python version set to 3.13
4:29:50 AM: Collecting pipenv
4:29:50 AM: Downloading pipenv-2024.4.1-py3-none-any.whl.metadata (17 kB)
4:29:50 AM: Collecting certifi (from pipenv)
4:29:50 AM: Downloading certifi-2025.1.31-py3-none-any.whl.metadata (2.5 kB)
4:29:50 AM: Collecting packaging>=22 (from pipenv)
4:29:50 AM: Downloading packaging-24.2-py3-none-any.whl.metadata (3.2 kB)
4:29:50 AM: Collecting setuptools>=67 (from pipenv)
4:29:50 AM: Downloading setuptools-75.8.0-py3-none-any.whl.metadata (6.7 kB)
4:29:50 AM: Collecting virtualenv>=20.24.2 (from pipenv)
4:29:50 AM: Downloading virtualenv-20.29.1-py3-none-any.whl.metadata (4.5 kB)
4:29:50 AM: Collecting distlib<1,>=0.3.7 (from virtualenv>=20.24.2->pipenv)
4:29:50 AM: Downloading distlib-0.3.9-py2.py3-none-any.whl.metadata (5.2 kB)
4:29:50 AM: Collecting filelock<4,>=3.12.2 (from virtualenv>=20.24.2->pipenv)
4:29:50 AM: Downloading filelock-3.17.0-py3-none-any.whl.metadata (2.9 kB)
4:29:50 AM: Collecting platformdirs<5,>=3.9.1 (from virtualenv>=20.24.2->pipenv)
4:29:50 AM: Downloading platformdirs-4.3.6-py3-none-any.whl.metadata (11 kB)
4:29:50 AM: Downloading pipenv-2024.4.1-py3-none-any.whl (3.0 MB)
4:29:50 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.0/3.0 MB 73.8 MB/s eta 0:00:00
4:29:50 AM: Downloading packaging-24.2-py3-none-any.whl (65 kB)
4:29:50 AM: Downloading setuptools-75.8.0-py3-none-any.whl (1.2 MB)
4:29:50 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 77.4 MB/s eta 0:00:00
4:29:50 AM: Downloading virtualenv-20.29.1-py3-none-any.whl (4.3 MB)
4:29:50 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.3/4.3 MB 172.5 MB/s eta 0:00:00
4:29:50 AM: Downloading certifi-2025.1.31-py3-none-any.whl (166 kB)
4:29:50 AM: Downloading distlib-0.3.9-py2.py3-none-any.whl (468 kB)
4:29:50 AM: Downloading filelock-3.17.0-py3-none-any.whl (16 kB)
4:29:51 AM: Downloading platformdirs-4.3.6-py3-none-any.whl (18 kB)
4:29:51 AM: Installing collected packages: distlib, setuptools, platformdirs, packaging, filelock, certifi, virtualenv, pipenv
4:29:52 AM: Successfully installed certifi-2025.1.31 distlib-0.3.9 filelock-3.17.0 packaging-24.2 pipenv-2024.4.1 platformdirs-4.3.6 setuptools-75.8.0 virtualenv-20.29.1
4:29:53 AM: [notice] A new release of pip is available: 24.3.1 -> 25.0
4:29:53 AM: [notice] To update, run: pip install --upgrade pip
4:29:53 AM: Attempting Ruby version 2.6.2, read from environment
4:29:53 AM: Started restoring cached Ruby version
4:29:53 AM: Finished restoring cached Ruby version
4:29:54 AM: Using Ruby version 2.6.2
4:29:54 AM: Started restoring cached go cache
4:29:54 AM: Finished restoring cached go cache
4:29:54 AM: Installing Go version 1.12 (requested 1.12)
4:29:58 AM: go version go1.12 linux/amd64
4:29:59 AM: Using PHP version 8.0
4:30:01 AM: Started restoring cached Node.js version
4:30:02 AM: Finished restoring cached Node.js version
4:30:03 AM: v14.3.0 is already installed.
4:30:03 AM: Now using node v14.3.0 (npm v6.14.5)
4:30:03 AM: Started restoring cached build plugins
4:30:03 AM: Finished restoring cached build plugins
4:30:03 AM: Successfully installed dependencies
4:30:03 AM: Starting build script
4:30:05 AM: Detected 1 framework(s)
4:30:05 AM: "cecil" at version "unknown"
4:30:05 AM: Section completed: initializing

Building

Complete
4:30:06 AM: Netlify Build
4:30:06 AM: ────────────────────────────────────────────────────────────────
4:30:06 AM:
4:30:06 AM: ❯ Version
4:30:06 AM: @netlify/build 29.58.9
4:30:06 AM:
4:30:06 AM: ❯ Flags
4:30:06 AM: accountId: 5a55185e8198766884f04205
4:30:06 AM: baseRelDir: false
4:30:06 AM: buildId: 67a58c2398f67400085b9769
4:30:06 AM: deployId: 67a58c2398f67400085b976b
4:30:06 AM:
4:30:06 AM: ❯ Current directory
4:30:06 AM: /opt/build/repo/netlify
4:30:06 AM:
4:30:06 AM: ❯ Config file
4:30:06 AM: /opt/build/repo/netlify.toml
4:30:06 AM:
4:30:06 AM: ❯ Context
4:30:06 AM: production
4:30:06 AM:
4:30:06 AM: build.command from netlify.toml
4:30:06 AM: ────────────────────────────────────────────────────────────────
4:30:06 AM: ​
4:30:06 AM: $ (wget --no-check-certificate --no-cache https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js) && node landscape.js
4:30:06 AM: Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.111.133, 185.199.109.133, 185.199.108.133, ...
4:30:06 AM: Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.111.133|:443... connected.
4:30:07 AM: HTTP request sent, awaiting response... 200 OK
4:30:07 AM: Length: 8750 (8.5K) [text/plain]
4:30:07 AM: Saving to: ‘landscape.js’
4:30:07 AM: 0K ........ 100% 68.0M=0s
4:30:07 AM: 2025-02-07 04:30:07 (68.0 MB/s) - ‘landscape.js’ saved [8750/8750]
4:30:07 AM: We have a secret: c8***75
4:30:07 AM: We have a secret: 8G***pb
4:30:07 AM: We have a secret: 87***eb
4:30:07 AM: We have a secret: gh***7r
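The "We have a secret: c8***75" lines are landscape.js confirming that its required tokens are present while printing only the first and last two characters. A minimal sketch of that kind of masking; the helper name and the environment variable names are illustrative, not taken from landscape.js:

```js
// Hypothetical helper reproducing the "c8***75" masking style seen above.
function logMaskedSecret(value) {
  if (!value) {
    console.log('Secret is missing');
    return;
  }
  console.log(`We have a secret: ${value.slice(0, 2)}***${value.slice(-2)}`);
}

// Example usage with assumed variable names (the real script reads its own set of keys).
for (const name of ['CRUNCHBASE_KEY', 'TWITTER_KEYS', 'GITHUB_KEY', 'GITHUB_TOKEN']) {
  logMaskedSecret(process.env[name]);
}
```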
4:30:07 AM: starting /opt/build/repo/netlify
4:30:07 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:30:07 AM: Warning: Permanently added '147.75.199.15' (ECDSA) to the list of known hosts.
4:30:08 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:30:08 AM: * Documentation: https://help.ubuntu.com
4:30:08 AM: * Management: https://landscape.canonical.com
4:30:08 AM: * Support: https://ubuntu.com/advantage
4:30:08 AM: System information as of Fri Feb 7 04:30:07 UTC 2025
4:30:08 AM: System load: 0.63
4:30:08 AM: Usage of /: 71.4% of 217.51GB
4:30:08 AM: Memory usage: 18%
4:30:08 AM: Swap usage: 1%
4:30:08 AM: Processes: 626
4:30:08 AM: Users logged in: 1
4:30:08 AM: IPv4 address for bond0: 147.75.199.15
4:30:08 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:30:08 AM: IPv4 address for docker0: 172.17.0.1
4:30:08 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:30:08 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:30:08 AM: 85 updates can be applied immediately.
4:30:08 AM: 10 of these updates are standard security updates.
4:30:08 AM: To see these additional updates run: apt list --upgradable
4:30:08 AM: New release '22.04.5 LTS' available.
4:30:08 AM: Run 'do-release-upgrade' to upgrade to it.
4:30:08 AM: 2 updates could not be installed automatically. For more details,
4:30:08 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:30:08 AM: *** System restart required ***
4:30:08 AM: Cloning into 'packageRemote'...
4:30:08 AM: node version: v18.3
4:30:10 AM: focal: Pulling from netlify/build
4:30:10 AM: Digest: sha256:b6bfc93734dd91a2e188135d2be3256c341898af93899b5497f2260fdcf6b6b2
4:30:10 AM: Status: Image is up to date for netlify/build:focal
4:30:10 AM: docker.io/netlify/build:focal
4:30:10 AM: focal: Pulling from netlify/build
4:30:10 AM: Digest: sha256:b6bfc93734dd91a2e188135d2be3256c341898af93899b5497f2260fdcf6b6b2
4:30:10 AM: Status: Image is up to date for netlify/build:focal
4:30:10 AM: docker.io/netlify/build:focal
4:30:15 AM: /opt/buildhome/.nvm/nvm.sh
4:30:15 AM: .:
4:30:15 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:30:15 AM: bin landscapes_dev package.json update_server
4:30:15 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:30:15 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:30:15 AM: files LICENSE server.js
4:30:15 AM: _headers netlify specs
4:30:15 AM: INSTALL.md netlify.md src
4:30:15 AM: v18.3
4:30:15 AM: Downloading and installing node v18.3.0...
4:30:15 AM: Computing checksum with sha256sum
4:30:16 AM: Checksums matched!
4:30:18 AM: Now using node v18.3.0 (npm v8.11.0)
4:30:19 AM: Now using node v18.3.0 (npm v8.11.0)
4:30:19 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:30:19 AM:
4:30:20 AM: added 3 packages, and audited 4 packages in 520ms
4:30:20 AM: found 0 vulnerabilities
4:30:20 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:30:20 AM:
4:30:22 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:30:22 AM: 27 packages are looking for funding
4:30:22 AM: run `npm fund` for details
4:30:22 AM: found 0 vulnerabilities
4:30:24 AM: added 1 package in 1s
4:30:24 AM: YN0000: ┌ Resolution step
4:30:25 AM: YN0000: └ Completed
4:30:25 AM: YN0000: ┌ Fetch step
4:30:30 AM: YN0013: │ 2 packages were already cached, 808 had to be fetched
4:30:30 AM: YN0000: └ Completed in 5s 617ms
4:30:30 AM: YN0000: ┌ Link step
4:30:31 AM: YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:30:32 AM: YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:30:32 AM: YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:30:32 AM: YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:30:33 AM: YN0000: │ puppeteer@npm:14.2.1 STDERR
4:30:33 AM: YN0000: │ puppeteer@npm:13.2.0 STDERR
4:30:37 AM: YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:30:37 AM: YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:30:37 AM: YN0000: └ Completed in 6s 657ms
4:30:37 AM: YN0000: Done with warnings in 12s 692ms
4:30:39 AM: Processing the tree
4:30:42 AM: Running with a level=easy. Settings:
4:30:42 AM: Use cached crunchbase data: true
4:30:42 AM: Use cached images data: true
4:30:42 AM: Use cached twitter data: true
4:30:42 AM: Use cached github basic stats: true
4:30:42 AM: Use cached github start dates: true
4:30:42 AM: Use cached best practices: true
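The run is started with level=easy, which reuses every cached data source listed above instead of re-fetching it. A sketch of how a level could map onto those cache flags; the flag names mirror the log, but the mapping itself is an assumption:

```js
// Hypothetical mapping from a build "level" to the cache settings printed above.
const levels = {
  easy: {                       // reuse everything that is already cached
    useCachedCrunchbaseData: true,
    useCachedImagesData: true,
    useCachedTwitterData: true,
    useCachedGithubBasicStats: true,
    useCachedGithubStartDates: true,
    useCachedBestPractices: true,
  },
  complete: {                   // assumed stricter level: re-fetch everything
    useCachedCrunchbaseData: false,
    useCachedImagesData: false,
    useCachedTwitterData: false,
    useCachedGithubBasicStats: false,
    useCachedGithubStartDates: false,
    useCachedBestPractices: false,
  },
};

const level = process.env.LEVEL || 'easy';
console.log(`Running with a level=${level}. Settings:`, levels[level]);
```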
4:30:42 AM: Fetching crunchbase entries
4:30:43 AM: ................................................................................
4:30:43 AM: ................................................................................
4:30:43 AM: ................................................................................
4:30:43 AM: ................................................................................
4:30:43 AM: ................................................................................
4:30:43 AM: .......................................................**
4:30:43 AM: Fetching github entries
4:30:51 AM: ................................................................................
4:30:51 AM: ................................................................................
4:30:51 AM: ..................................*********************.........................
4:30:51 AM: ................................................................................
4:30:51 AM: ....................................................................*********
4:30:51 AM: Fetching start date entries
4:30:54 AM: ................................................................................
4:30:54 AM: ................................................................................
4:30:54 AM: ............................................***********.........................
4:30:54 AM: ................................................................................
4:30:54 AM: ..........................................................*******************
4:30:54 AM: Fetching images
4:30:54 AM: got image entries
4:30:54 AM: Hash for Prefect is prefect-2
4:31:01 AM: ................................................................................
4:31:01 AM: .....**.......**................................................................
4:31:01 AM: ................................................................................
4:31:01 AM: ................................................................................
4:31:01 AM: ................................................................................
4:31:01 AM: ................................................................................
4:31:01 AM: ...
4:31:01 AM: Fetching last tweet dates
4:31:01 AM: Fetching best practices
4:31:02 AM: ................................................................................
4:31:02 AM: ................................................................................
4:31:02 AM: ................................................................................
4:31:02 AM: ................................................................................
4:31:02 AM: ................................................
4:31:02 AM: Fetching CLOMonitor data
4:31:02 AM: Processing the tree
4:31:02 AM: saving!
4:31:04 AM: Hash for Prefect is prefect-2
4:31:04 AM: Warning: Open Platform for Enterprise AI (OPEA) has a twitter https://twitter.com/opeadev which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Dragonfly has a twitter https://twitter.com/dragonfly_oss which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Vald has a twitter https://twitter.com/vdaas_vald which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: FastTrackML has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Mathesar has a twitter https://twitter.com/centerofci which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: RWKV has a twitter https://twitter.com/RWKV_AI which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Langchain has a twitter https://twitter.com/LangChainAI which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Haystack has a twitter https://twitter.com/deepset_ai which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Armada has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: envd has a twitter https://twitter.com/TensorChord which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: NVIDIA Corporation has a twitter https://twitter.com/nvidia which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Advanced Micro Devices (AMD) has a twitter https://twitter.com/AMD which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Alibaba Cloud (Singapore) Private LTD has a twitter https://twitter.com/AlibabaB2B which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Fujitsu Limited has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: GitCode has a twitter https://twitter.com/turingbook which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Neo4j, Inc. has a twitter https://twitter.com/neo4j which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: New Relic, Inc. has a twitter https://twitter.com/newrelic which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Posit has a twitter https://twitter.com/posit_pbc which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Snyk Limited has a twitter https://twitter.com/snyksec which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: AI for People has a twitter https://twitter.com/AIforPeople which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Ambianic.ai has a twitter https://twitter.com/ambianicai which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Columbia University has a twitter https://twitter.com/Columbia which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Common Crawl Foundation has a twitter https://twitter.com/commoncrawl which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Ersilia Open Source Initiative has a twitter https://twitter.com/ersiliaio which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Ewha Institute for Biomedical Law & Ethics has a twitter https://twitter.com/cybercampus which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Montreal AI Ethics Institute has a twitter https://twitter.com/mtlaiethics which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: NIPA has a twitter https://twitter.com/NIPAkr which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: One Fact Foundation has a twitter https://twitter.com/onefact_org which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Pranveer Singh Institute Of Technology has a twitter https://twitter.com/PSITKanpur2004 which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Rensselaer Polytechnic Institute has a twitter https://twitter.com/rpi which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Transport for NSW has a twitter https://twitter.com/TransportforNSW which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: University of Michigan has a twitter https://twitter.com/UMich which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Wikimedia Deutschland e. V. has a twitter https://twitter.com/WikimediaDE which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: World Ethical Data Foundation has a twitter https://twitter.com/WEDF_foundation which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: BasicAI(hosting) has a twitter https://twitter.com/BasicAIteam which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Fujitsu Limited(hosting) has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:31:04 AM: Warning: Owkin(hosting) has a twitter https://twitter.com/owkinscience which is invalid or we just can not fetch its tweets
4:31:05 AM: Fetching members from LF AI & Data Member Company category
4:31:05 AM: Processing the tree
4:31:05 AM: Assigning Premier membership on DataHub (LinkedIn) because its parent Microsoft has Premier membership
4:31:06 AM: Assigning Premier membership on Brooklin (LinkedIn) because its parent Microsoft has Premier membership
4:31:06 AM: Assigning Premier membership on Azkaban (LinkedIn) because its parent Microsoft has Premier membership
4:31:06 AM: Assigning Premier membership on LinkedIn (hosting) (LinkedIn) because its parent Microsoft has Premier membership
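The "Assigning Premier membership" lines apply an inheritance rule: an item hosted by a subsidiary gets the membership level of its parent organization. A small sketch of that rule; the function and field names are illustrative, not the generator's actual code:

```js
// Hypothetical membership inheritance, mirroring the log lines above.
function inheritMembership(item, parentName, membershipByOrg) {
  const parentLevel = membershipByOrg[parentName];            // e.g. 'Premier'
  if (parentLevel && !membershipByOrg[item.organization]) {
    item.membership = parentLevel;
    console.log(
      `Assigning ${parentLevel} membership on ${item.name} (${item.organization}) ` +
      `because its parent ${parentName} has ${parentLevel} membership`
    );
  }
}

// e.g. inheritMembership({ name: 'DataHub', organization: 'LinkedIn' }, 'Microsoft',
//                        { Microsoft: 'Premier' });
```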
4:31:06 AM: Hash for Chaitanya Bharathi Institute of Technology is chaitanya-bharathi-institute-of-technology-2
4:31:06 AM: Hash for Fast.ai is fast-ai-2
4:31:06 AM: Hash for Great Expectations is great-expectations-2
4:31:06 AM: Hash for ML Perf is ml-perf-2
4:31:06 AM: Hash for PipelineAI is pipeline-ai-2
4:31:06 AM: Hash for Prefect is prefect-2
4:31:06 AM: Hash for Redash is redash-2
4:31:06 AM: Hash for Wikimedia Deutschland e.V. is wikimedia-deutschland-e-v-2
4:31:08 AM: {
4:31:08 AM: name: 'Accord.NET',
4:31:08 AM: homepage_url: 'http://accord-framework.net/',
4:31:08 AM: logo: 'accord-net.svg',
4:31:08 AM: github_data: {
4:31:08 AM: languages: [
4:31:08 AM: [Object], [Object],
4:31:08 AM: [Object], [Object],
4:31:08 AM: [Object], [Object],
4:31:08 AM: [Object], [Object],
4:31:08 AM: [Object], [Object],
4:31:08 AM: [Object], [Object],
4:31:08 AM: [Object], [Object],
4:31:08 AM: [Object]
4:31:08 AM: ],
4:31:08 AM: contributions: '0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0',
4:31:08 AM: firstWeek: '2022-11-27Z',
4:31:08 AM: stars: 4404,
4:31:08 AM: license: 'GNU Lesser General Public License v2.1',
4:31:08 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:31:08 AM: latest_commit_date: '2020-11-18T19:53:01Z',
4:31:08 AM: latest_commit_link: '/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:31:08 AM: release_date: '2017-10-19T21:00:56Z',
4:31:08 AM: contributors_count: 98,
4:31:08 AM: },
4:31:08 AM: repos: [ { url: 'https://github.com/accord-net/framework', stars: 4404 } ],
4:31:08 AM: github_start_commit_data: {
4:31:08 AM: start_commit_link: '/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:31:08 AM: start_date: '2012-04-08T14:05:58Z'
4:31:08 AM: },
4:31:08 AM: image_data: {
4:31:08 AM: fileName: 'accord-net.svg',
4:31:08 AM: hash: 'tdvFWfo6hfZQqHqjscAtcoZWk6qztZFzBTVqKWUjYig='
4:31:08 AM: },
4:31:08 AM: firstCommitDate: '2012-04-08T14:05:58Z',
4:31:08 AM: latestCommitDate: '2020-11-18T19:53:01Z',
4:31:08 AM: releaseDate: '2017-10-19T21:00:56Z',
4:31:08 AM: commitsThisYear: 0,
4:31:08 AM: contributorsCount: 98,
4:31:08 AM: language: 'C#',
4:31:08 AM: stars: 4404,
4:31:08 AM: license: 'GNU Lesser General Public License v2.1',
4:31:08 AM: headquarters: 'Grenoble, France',
4:31:08 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:31:08 AM: organization: 'Accord.NET Framework',
4:31:08 AM: crunchbaseData: {
4:31:08 AM: name: 'Accord.NET Framework',
4:31:08 AM: description: 'Machine Learning Framework',
4:31:08 AM: homepage: 'http://accord-framework.net/',
4:31:08 AM: city: 'Grenoble',
4:31:08 AM: region: 'Rhone-Alpes',
4:31:08 AM: country: 'France',
4:31:08 AM: twitter: null,
4:31:08 AM: linkedin: null,
4:31:08 AM: acquisitions: [],
4:31:08 AM: parents: [],
4:31:08 AM: stockExchange: null,
4:31:08 AM: company_type: 'Non Profit',
4:31:08 AM: industries: [
4:31:08 AM: 'Analytics',
4:31:08 AM: 'Artificial Intelligence',
4:31:08 AM: 'Hardware',
4:31:08 AM: 'Machine Learning'
4:31:08 AM: ],
4:31:08 AM: numEmployeesMin: null,
4:31:08 AM: numEmployeesMax: null
4:31:08 AM: },
4:31:08 AM: path: 'Machine Learning / Framework',
4:31:08 AM: landscape: 'Machine Learning / Framework',
4:31:08 AM: category: 'Machine Learning',
4:31:08 AM: amount: 'N/A',
4:31:08 AM: oss: true,
4:31:08 AM: href: 'logos/accord-net.svg',
4:31:08 AM: bestPracticeBadgeId: false,
4:31:08 AM: bestPracticePercentage: null,
4:31:08 AM: industries: [
4:31:08 AM: 'Analytics',
4:31:08 AM: 'Artificial Intelligence',
4:31:08 AM: 'Hardware',
4:31:08 AM: 'Machine Learning'
4:31:08 AM: ],
4:31:08 AM: starsPresent: true,
4:31:08 AM: starsAsText: '4,404',
4:31:08 AM: marketCapPresent: false,
4:31:08 AM: marketCapAsText: 'N/A',
4:31:08 AM: id: 'accord-net',
4:31:08 AM: flatName: 'Accord.NET',
4:31:08 AM: member: false,
4:31:08 AM: relation: false,
4:31:08 AM: isSubsidiaryProject: false
4:31:08 AM: } 2020-11-18T19:53:01Z
4:31:08 AM: [
4:31:08 AM: 'Community Data License Agreement (CDLA)',
4:31:08 AM: 'PlaNet',
4:31:08 AM: 'Generic Neural Elastic Search (GNES)',
4:31:08 AM: 'PredictionIO',
4:31:08 AM: 'ELI5',
4:31:08 AM: 'BERT',
4:31:08 AM: 'Nauta',
4:31:08 AM: 'DAWNBench',
4:31:08 AM: 'AresDB',
4:31:08 AM: 'dotmesh',
4:31:08 AM: 'Audit AI',
4:31:08 AM: 'euler',
4:31:08 AM: 'Clipper',
4:31:08 AM: 'Accord.NET',
4:31:08 AM: 'Shogun',
4:31:08 AM: 'DELTA',
4:31:08 AM: 'BeakerX',
4:31:08 AM: 'PixieDust',
4:31:08 AM: 'TreeInterpreter',
4:31:08 AM: 'Cyclone',
4:31:08 AM: 'Lucid',
4:31:08 AM: 'XLM',
4:31:08 AM: 'Chainer RL',
4:31:08 AM: 'ForestFlow',
4:31:08 AM: 'uReplicator',
4:31:08 AM: 'Elastic Deep Learning (EDL)',
4:31:08 AM: 'Kashgari',
4:31:08 AM: 'DataPractices',
4:31:08 AM: 'X-DeepLearning',
4:31:08 AM: 'LIME',
4:31:08 AM: 'Model Asset eXchange (MAX)',
4:31:08 AM: 'TransmogrifAI',
4:31:08 AM: 'OpenBytes',
4:31:08 AM: 'DeepLIFT',
4:31:08 AM: 'Onepanel',
4:31:08 AM: 'DeepSpeech',
4:31:08 AM: 'Lucene',
4:31:08 AM: 'Turi Create',
4:31:08 AM: 'Visual Object Tagging Tool (VoTT)',
4:31:08 AM: 'Acumos',
4:31:08 AM: 'Skater',
4:31:08 AM: 'Catalyst',
4:31:08 AM: 'SKIP Language',
4:31:08 AM: 'SQLFlow',
4:31:08 AM: 'Advertorch',
4:31:08 AM: 'xLearn',
4:31:08 AM: 'Neuropod',
4:31:08 AM: 'AdvBox',
4:31:08 AM: 'RCloud',
4:31:08 AM: 'Neo-AI',
4:31:08 AM: 'Embedded Learning Library',
4:31:08 AM: 'Stable Baselines',
4:31:08 AM: 'talos',
4:31:08 AM: 'LabelImg',
4:31:08 AM: 'MMdnn',
4:31:08 AM: 'CNTK',
4:31:08 AM: 'Machine Learning eXchange',
4:31:08 AM: 'Singularity',
4:31:08 AM: 'Chainer',
4:31:08 AM: 'PyText',
4:31:08 AM: 'Pipeline.ai',
4:31:08 AM: 'Apache Bahir',
4:31:08 AM: 'NLP Architect',
4:31:08 AM: 'AllenNLP',
4:31:08 AM: 'Angel-ML',
4:31:08 AM: 'SEED RL',
4:31:08 AM: 'Coach',
4:31:08 AM: 'Gluon-NLP',
4:31:08 AM: 'DeepMind Lab',
4:31:08 AM: 'SEAL',
4:31:08 AM: 'MXNet',
4:31:08 AM: 'OpenAI Gym',
4:31:08 AM: 'MindMeld',
4:31:08 AM: 'CleverHans',
4:31:08 AM: 'Petastorm',
4:31:08 AM: 'Hawq',
4:31:08 AM: 'TF Encrypted',
4:31:08 AM: 'faust',
4:31:08 AM: 'Cortex',
4:31:08 AM: 'OpenDataology',
4:31:08 AM: 'YouTokenToMe',
4:31:08 AM: 'ALBERT',
4:31:08 AM: 'Adlik',
4:31:08 AM: '1chipML',
4:31:08 AM: 'Neural Network Distiller',
4:31:08 AM: 'Labelbox',
4:31:08 AM: 'Facets',
4:31:08 AM: 'OpenNN',
4:31:08 AM: 'Pilosa',
4:31:08 AM: 'Orchest',
4:31:08 AM: 'Model Server for Apache MXNet',
4:31:08 AM: 'LASER',
4:31:08 AM: 'Dopamine',
4:31:08 AM: 'MindSpore',
4:31:08 AM: 'HE Lib',
4:31:08 AM: 'd6tflow',
4:31:08 AM: 'Sonnet',
4:31:08 AM: 'Plaid ML',
4:31:08 AM: 'Nyoka',
4:31:08 AM: 'doccano',
4:31:08 AM: ... 253 more items
4:31:08 AM: ]
4:31:12 AM: ncc: Version 0.34.0
4:31:12 AM: ncc: Compiling file index.js into CJS
4:31:14 AM: ncc: Version 0.34.0
4:31:14 AM: ncc: Compiling file index.js into CJS
4:31:14 AM: ncc: Version 0.34.0
4:31:14 AM: ncc: Compiling file index.js into CJS
4:31:18 AM: Development server running at http://127.0.0.1:4000/
4:31:28 AM: Running integration-test,check-landscape,render-landscape,funding in parallel
4:31:30 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:32 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:33 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:36 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:36 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:31:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:31:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:31:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:31:40 AM: api request starting... /api/ids?category=&project=&license=&organization=accord-net-framework&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:40 AM: api request starting... /api/ids?category=&project=&license=&organization=microsoft&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:51 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:11 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
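Each "api request starting..." line is the dev server at http://127.0.0.1:4000/ answering an /api/ids query whose parameters mirror the landscape filters (category, project, license, organization, grouping, format, and so on). A minimal sketch of issuing one of these requests against the local server; the query values are copied from the log, while the shape of the response is assumed:

```js
// Hypothetical client call against the dev server started earlier in this build.
const params = new URLSearchParams({
  category: '', project: '', license: '',
  organization: 'accord-net-framework',
  headquarters: '', 'company-type': '', industries: '',
  sort: 'name', grouping: 'project',
  bestpractices: '', enduser: '', parent: '',
  language: '', specification: '', format: 'card',
});

const res = await fetch(`http://127.0.0.1:4000/api/ids?${params}`);
console.log(await res.json());   // assumed: the ids of items matching the filters
```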
4:32:31 AM: Task: integration-test PASS specs/main.spec.js (9.897s)
4:32:31 AM: Main test
4:32:31 AM: I visit a main page and have all required elements
4:32:31 AM: ✓ I can open a page (1609ms)
4:32:31 AM: ✓ A proper header is present (5ms)
4:32:31 AM: ✓ Group headers are ok (2ms)
4:32:31 AM: ✓ I see a You are viewing text (1ms)
4:32:31 AM: ✓ A proper card is present (2ms)
4:32:31 AM: ✓ If I click on a card, I see a modal dialog (328ms)
4:32:31 AM: ✓ Closing a browser (29ms)
4:32:31 AM: Landscape Test
4:32:31 AM: ✓ I visit LF AI & Data Members landscape page and have all required elements, elements are clickable (1208ms)
4:32:31 AM: ✓ Closing a browser (22ms)
4:32:31 AM: ✓ I visit Companies Hosting Projects landscape page and have all required elements, elements are clickable (659ms)
4:32:31 AM: ✓ Closing a browser (20ms)
4:32:31 AM: I visit a main landscape page and have all required elements
4:32:31 AM: ✓ I open a landscape page and wait for it to load (1960ms)
4:32:31 AM: ✓ When I click on an item the modal is open (164ms)
4:32:31 AM: ✓ If I would straight open the url with a selected id, a modal appears (1950ms)
4:32:31 AM: ✓ Closing a browser (35ms)
4:32:31 AM: Filtering by organization
4:32:31 AM: ✓ Checking we see Accord.NET when filtering by organization Accord.NET Framework (595ms)
4:32:31 AM: ✓ Checking we don't see Accord.NET when filtering by organization Microsoft (226ms)
4:32:31 AM: ✓ Closing a browser (24ms)
4:32:31 AM: PASS specs/tools/actualTwitter.spec.js
4:32:31 AM: Twitter URL
4:32:31 AM: when crunchbase data not set
4:32:31 AM: ✓ returns URL from node (2ms)
4:32:31 AM: when node does not have twitter URL
4:32:31 AM: ✓ returns URL from node
4:32:31 AM: when node has twitter URL set to null
4:32:31 AM: ✓ returns undefined
4:32:31 AM: when both node and crunchbase have twitter URL
4:32:31 AM: ✓ returns URL from node
4:32:31 AM: when twitter URL is not set anywhere
4:32:31 AM: ✓ returns undefined (1ms)
4:32:31 AM: cleaning up twitter URL
4:32:31 AM: ✓ replaces http with https
4:32:31 AM: ✓ removes www
4:32:31 AM: ✓ query string
4:32:31 AM: Test Suites: 2 passed, 2 total
4:32:31 AM: Tests: 26 passed, 26 total
4:32:31 AM: Snapshots: 0 total
4:32:31 AM: Time: 10.131s
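The actualTwitter.spec.js cases pin down the URL-cleaning behaviour: prefer the twitter URL on the node over Crunchbase data, treat an explicit null as "no twitter", force https, drop www, and strip any query string. A sketch of a function with that behaviour, reconstructed from the test names only; the real implementation under tools/ may differ:

```js
// Hypothetical reconstruction of the behaviour described by the test names above.
function actualTwitter(node = {}, crunchbaseEntry = {}) {
  // Prefer the node's own value; an explicit null means "no twitter at all".
  const url = 'twitter' in node ? node.twitter : crunchbaseEntry.twitter;
  if (!url) {
    return undefined;
  }
  return url
    .replace(/^http:\/\//, 'https://')        // "replaces http with https"
    .replace(/^https:\/\/www\./, 'https://')  // "removes www"
    .split('?')[0];                           // strips the query string
}

// actualTwitter({ twitter: 'http://www.twitter.com/foo?lang=en' })
//   -> 'https://twitter.com/foo'
```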
4:32:31 AM: Task: check-landscape
4:32:31 AM: Task: render-landscape visiting http://localhost:4000/fullscreen?version=2025-02-07T04:31:29Z d651106&scale=false&pdf
4:32:31 AM: visiting http://localhost:4000/fullscreen?version=2025-02-07T04:31:29Z d651106&scale=false&pdf
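The render-landscape task loads the /fullscreen page of the dev server, presumably with one of the Puppeteer versions unpacked during the Link step, to produce the static renders. A sketch of that kind of headless render; only the URL comes from the log, the output path and options are assumptions:

```js
// Hypothetical headless render of the fullscreen landscape page.
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(
    'http://localhost:4000/fullscreen?version=2025-02-07T04:31:29Z d651106&scale=false&pdf',
    { waitUntil: 'networkidle0' }   // wait until the landscape has fully loaded
  );
  await page.pdf({ path: 'landscape.pdf', printBackground: true });
  await browser.close();
})();
```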
4:32:31 AM: Task: funding From https://github.com/lfai/lfai-landscape
4:32:31 AM: * [new branch] NSouthernLF-patch-1 -> github/NSouthernLF-patch-1
4:32:31 AM: * [new branch] NSouthernLF-patch-2 -> github/NSouthernLF-patch-2
4:32:31 AM: * [new branch] NSouthernLF-patch-3 -> github/NSouthernLF-patch-3
4:32:31 AM: * [new branch] NSouthernLF-patch-3-1 -> github/NSouthernLF-patch-3-1
4:32:31 AM: * [new branch] create-pull-request/patch-1693628094 -> github/create-pull-request/patch-1693628094
4:32:31 AM: * [new branch] create-pull-request/patch-1693714471 -> github/create-pull-request/patch-1693714471
4:32:31 AM: * [new branch] create-pull-request/patch-1703668428 -> github/create-pull-request/patch-1703668428
4:32:31 AM: * [new branch] main -> github/main
4:32:31 AM: * [new branch] revert-303-main -> github/revert-303-main
4:32:31 AM: * [new branch] web-landscape-2021-10-01T22-18 -> github/web-landscape-2021-10-01T22-18
4:32:31 AM: * [new branch] web-landscape-2021-10-12T16-06 -> github/web-landscape-2021-10-12T16-06
4:32:31 AM: * [new branch] web-landscape-2021-10-18T19-46 -> github/web-landscape-2021-10-18T19-46
4:32:31 AM: * [new branch] web-landscape-2021-10-20T18-58 -> github/web-landscape-2021-10-20T18-58
4:32:34 AM: Output from remote build, exit code: 0
4:32:34 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:32:34 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:32:34 AM: * Documentation: https://help.ubuntu.com
4:32:34 AM: * Management: https://landscape.canonical.com
4:32:34 AM: * Support: https://ubuntu.com/advantage
4:32:34 AM: System information as of Fri Feb 7 04:30:13 UTC 2025
4:32:34 AM: System load: 0.58
4:32:34 AM: Usage of /: 71.4% of 217.51GB
4:32:34 AM: Memory usage: 18%
4:32:34 AM: Swap usage: 1%
4:32:34 AM: Processes: 644
4:32:34 AM: Users logged in: 1
4:32:34 AM: IPv4 address for bond0: 147.75.199.15
4:32:34 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:32:34 AM: IPv4 address for docker0: 172.17.0.1
4:32:34 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:32:34 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:32:34 AM: 85 updates can be applied immediately.
4:32:34 AM: 10 of these updates are standard security updates.
4:32:34 AM: To see these additional updates run: apt list --upgradable
4:32:34 AM: New release '22.04.5 LTS' available.
4:32:34 AM: Run 'do-release-upgrade' to upgrade to it.
4:32:34 AM: 2 updates could not be installed automatically. For more details,
4:32:34 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:32:34 AM: *** System restart required ***
4:32:34 AM: /opt/buildhome/.nvm/nvm.sh
4:32:34 AM: .:
4:32:34 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:32:34 AM: bin landscapes_dev package.json update_server
4:32:34 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:32:34 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:32:34 AM: files LICENSE server.js
4:32:34 AM: _headers netlify specs
4:32:34 AM: INSTALL.md netlify.md src
4:32:34 AM: v18.3
4:32:34 AM: Downloading and installing node v18.3.0...
4:32:34 AM: Computing checksum with sha256sum
4:32:34 AM: Checksums matched!
4:32:34 AM: Now using node v18.3.0 (npm v8.11.0)
4:32:34 AM: Now using node v18.3.0 (npm v8.11.0)
4:32:34 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:32:34 AM:
4:32:34 AM: added 3 packages, and audited 4 packages in 520ms
4:32:34 AM: found 0 vulnerabilities
4:32:34 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:32:34 AM:
4:32:34 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:32:34 AM: 27 packages are looking for funding
4:32:34 AM: run `npm fund` for details
4:32:34 AM: found 0 vulnerabilities
4:32:34 AM: added 1 package in 1s
4:32:34 AM: YN0000: ┌ Resolution step
4:32:34 AM: YN0000: └ Completed
4:32:34 AM: YN0000: ┌ Fetch step
4:32:34 AM: YN0013: │ 2 packages were already cached, 808 had to be fetched
4:32:34 AM: YN0000: └ Completed in 5s 617ms
4:32:34 AM: YN0000: ┌ Link step
4:32:34 AM: YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:32:34 AM: YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:32:34 AM: YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:32:34 AM: YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:32:34 AM: YN0000: │ puppeteer@npm:14.2.1 STDERR
4:32:34 AM: YN0000: │ puppeteer@npm:13.2.0 STDERR
4:32:34 AM: YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:32:34 AM: YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:32:34 AM: YN0000: └ Completed in 6s 657ms
4:32:34 AM: YN0000: Done with warnings in 12s 692ms
4:32:34 AM: Processing the tree
4:32:34 AM: Running with a level=easy. Settings:
4:32:34 AM: Use cached crunchbase data: true
4:32:34 AM: Use cached images data: true
4:32:34 AM: Use cached twitter data: true
4:32:34 AM: Use cached github basic stats: true
4:32:34 AM: Use cached github start dates: true
4:32:34 AM: Use cached best practices: true
4:32:34 AM: Fetching crunchbase entries
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: .......................................................**
4:32:34 AM: Fetching github entries
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ..................................*********************.........................
4:32:34 AM: ................................................................................
4:32:34 AM: ....................................................................*********
4:32:34 AM: Fetching start date entries
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ............................................***********.........................
4:32:34 AM: ................................................................................
4:32:34 AM: ..........................................................*******************
4:32:34 AM: Fetching images
4:32:34 AM: got image entries
4:32:34 AM: Hash for Prefect is prefect-2
4:32:34 AM: ................................................................................
4:32:34 AM: .....**.......**................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ...
4:32:34 AM: Fetching last tweet dates
4:32:34 AM: Fetching best practices
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................................................
4:32:34 AM: ................................................
4:32:34 AM: Fetching CLOMonitor data
4:32:34 AM: Processing the tree
4:32:34 AM: saving!
4:32:34 AM: Hash for Prefect is prefect-2
4:32:34 AM: Warning: Open Platform for Enterprise AI (OPEA) has a twitter https://twitter.com/opeadev which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Dragonfly has a twitter https://twitter.com/dragonfly_oss which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Vald has a twitter https://twitter.com/vdaas_vald which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: FastTrackML has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Mathesar has a twitter https://twitter.com/centerofci which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: RWKV has a twitter https://twitter.com/RWKV_AI which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Langchain has a twitter https://twitter.com/LangChainAI which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Haystack has a twitter https://twitter.com/deepset_ai which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Armada has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: envd has a twitter https://twitter.com/TensorChord which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: NVIDIA Corporation has a twitter https://twitter.com/nvidia which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Advanced Micro Devices (AMD) has a twitter https://twitter.com/AMD which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Alibaba Cloud (Singapore) Private LTD has a twitter https://twitter.com/AlibabaB2B which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Fujitsu Limited has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: GitCode has a twitter https://twitter.com/turingbook which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Neo4j, Inc. has a twitter https://twitter.com/neo4j which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: New Relic, Inc. has a twitter https://twitter.com/newrelic which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Posit has a twitter https://twitter.com/posit_pbc which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Snyk Limited has a twitter https://twitter.com/snyksec which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: AI for People has a twitter https://twitter.com/AIforPeople which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Ambianic.ai has a twitter https://twitter.com/ambianicai which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Columbia University has a twitter https://twitter.com/Columbia which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Common Crawl Foundation has a twitter https://twitter.com/commoncrawl which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Ersilia Open Source Initiative has a twitter https://twitter.com/ersiliaio which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Ewha Institute for Biomedical Law & Ethics has a twitter https://twitter.com/cybercampus which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Montreal AI Ethics Institute has a twitter https://twitter.com/mtlaiethics which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: NIPA has a twitter https://twitter.com/NIPAkr which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: One Fact Foundation has a twitter https://twitter.com/onefact_org which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Pranveer Singh Institute Of Technology has a twitter https://twitter.com/PSITKanpur2004 which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Rensselaer Polytechnic Institute has a twitter https://twitter.com/rpi which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Transport for NSW has a twitter https://twitter.com/TransportforNSW which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: University of Michigan has a twitter https://twitter.com/UMich which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Wikimedia Deutschland e. V. has a twitter https://twitter.com/WikimediaDE which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: World Ethical Data Foundation has a twitter https://twitter.com/WEDF_foundation which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: BasicAI(hosting) has a twitter https://twitter.com/BasicAIteam which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Fujitsu Limited(hosting) has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:32:34 AM: Warning: Owkin(hosting) has a twitter https://twitter.com/owkinscience which is invalid or we just can not fetch its tweets
4:32:34 AM: Fetching members from LF AI & Data Member Company category
4:32:34 AM: Processing the tree
4:32:34 AM: Assigning Premier membership on DataHub (LinkedIn) because its parent Microsoft has Premier membership
4:32:34 AM: Assigning Premier membership on Brooklin (LinkedIn) because its parent Microsoft has Premier membership
4:32:34 AM: Assigning Premier membership on Azkaban (LinkedIn) because its parent Microsoft has Premier membership
4:32:34 AM: Assigning Premier membership on LinkedIn (hosting) (LinkedIn) because its parent Microsoft has Premier membership
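Note: the four "Assigning Premier membership" lines above reflect a parent-to-child inheritance rule applied while the tree is processed: a child entry without its own membership picks up its parent's Premier level. A minimal sketch of that rule follows; the item shape and field names (name, parent, membership) are assumptions for illustration, not the landscapeapp's actual code.

    // Hedged sketch: propagate a parent's membership level to child entries.
    // Field names below are assumptions, not the real data model.
    function inheritMembership(items) {
      const byName = new Map(items.map((item) => [item.name, item]));
      for (const item of items) {
        const parent = item.parent ? byName.get(item.parent) : null;
        if (parent && parent.membership === 'Premier' && !item.membership) {
          item.membership = 'Premier';
          console.info(
            `Assigning Premier membership on ${item.name} because its parent ${parent.name} has Premier membership`
          );
        }
      }
      return items;
    }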
4:32:34 AM: Hash for Chaitanya Bharathi Institute of Technology is chaitanya-bharathi-institute-of-technology-2
4:32:34 AM: Hash for Fast.ai is fast-ai-2
4:32:34 AM: Hash for Great Expectations is great-expectations-2
4:32:34 AM: Hash for ML Perf is ml-perf-2
4:32:34 AM: Hash for PipelineAI is pipeline-ai-2
4:32:34 AM: Hash for Prefect is prefect-2
4:32:34 AM: Hash for Redash is redash-2
4:32:34 AM: Hash for Wikimedia Deutschland e.V. is wikimedia-deutschland-e-v-2
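Note: the "Hash for … is …-2" lines indicate that when two entries normalize to the same URL-safe id, a numeric suffix is appended to keep ids unique. A rough sketch of that de-duplication is below; saneName and the exact normalization rules are guesses for illustration only.

    // Hedged sketch: build URL-safe ids and suffix duplicates with -2, -3, ...
    // saneName() is an assumed normalization, not the project's actual helper.
    function saneName(name) {
      return name.toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/^-|-$/g, '');
    }

    function assignIds(names) {
      const seen = new Map();
      return names.map((name) => {
        const base = saneName(name);
        const count = (seen.get(base) || 0) + 1;
        seen.set(base, count);
        const id = count === 1 ? base : `${base}-${count}`;
        if (count > 1) console.info(`Hash for ${name} is ${id}`);
        return { name, id };
      });
    }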
4:32:34 AM: {
4:32:34 AM: name: 'Accord.NET',
4:32:34 AM: homepage_url: 'http://accord-framework.net/',
4:32:34 AM: logo: 'accord-net.svg',
4:32:34 AM: github_data: {
4:32:34 AM: languages: [
4:32:34 AM: [Object], [Object],
4:32:34 AM: [Object], [Object],
4:32:34 AM: [Object], [Object],
4:32:34 AM: [Object], [Object],
4:32:34 AM: [Object], [Object],
4:32:34 AM: [Object], [Object],
4:32:34 AM: [Object], [Object],
4:32:34 AM: [Object]
4:32:34 AM: ],
4:32:34 AM: contributions: '0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0',
4:32:34 AM: firstWeek: '2022-11-27Z',
4:32:34 AM: stars: 4404,
4:32:34 AM: license: 'GNU Lesser General Public License v2.1',
4:32:34 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:32:34 AM: latest_commit_date: '2020-11-18T19:53:01Z',
4:32:34 AM: latest_commit_link: '/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:32:34 AM: release_date: '2017-10-19T21:00:56Z',
4:32:34 AM: contributors_count: 98,
4:32:34 AM: },
4:32:34 AM: repos: [ { url: 'https://github.com/accord-net/framework', stars: 4404 } ],
4:32:34 AM: github_start_commit_data: {
4:32:34 AM: start_commit_link: '/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:32:34 AM: start_date: '2012-04-08T14:05:58Z'
4:32:34 AM: },
4:32:34 AM: image_data: {
4:32:34 AM: fileName: 'accord-net.svg',
4:32:34 AM: hash: 'tdvFWfo6hfZQqHqjscAtcoZWk6qztZFzBTVqKWUjYig='
4:32:34 AM: },
4:32:34 AM: firstCommitDate: '2012-04-08T14:05:58Z',
4:32:34 AM: latestCommitDate: '2020-11-18T19:53:01Z',
4:32:34 AM: releaseDate: '2017-10-19T21:00:56Z',
4:32:34 AM: commitsThisYear: 0,
4:32:34 AM: contributorsCount: 98,
4:32:34 AM: language: 'C#',
4:32:34 AM: stars: 4404,
4:32:34 AM: license: 'GNU Lesser General Public License v2.1',
4:32:34 AM: headquarters: 'Grenoble, France',
4:32:34 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:32:34 AM: organization: 'Accord.NET Framework',
4:32:34 AM: crunchbaseData: {
4:32:34 AM: name: 'Accord.NET Framework',
4:32:34 AM: description: 'Machine Learning Framework',
4:32:34 AM: homepage: 'http://accord-framework.net/',
4:32:34 AM: city: 'Grenoble',
4:32:34 AM: region: 'Rhone-Alpes',
4:32:34 AM: country: 'France',
4:32:34 AM: twitter: null,
4:32:34 AM: linkedin: null,
4:32:34 AM: acquisitions: [],
4:32:34 AM: parents: [],
4:32:34 AM: stockExchange: null,
4:32:34 AM: company_type: 'Non Profit',
4:32:34 AM: industries: [
4:32:34 AM: 'Analytics',
4:32:34 AM: 'Artificial Intelligence',
4:32:34 AM: 'Hardware',
4:32:34 AM: 'Machine Learning'
4:32:34 AM: ],
4:32:34 AM: numEmployeesMin: null,
4:32:34 AM: numEmployeesMax: null
4:32:34 AM: },
4:32:34 AM: path: 'Machine Learning / Framework',
4:32:34 AM: landscape: 'Machine Learning / Framework',
4:32:34 AM: category: 'Machine Learning',
4:32:34 AM: amount: 'N/A',
4:32:34 AM: oss: true,
4:32:34 AM: href: 'logos/accord-net.svg',
4:32:34 AM: bestPracticeBadgeId: false,
4:32:34 AM: bestPracticePercentage: null,
4:32:34 AM: industries: [
4:32:34 AM: 'Analytics',
4:32:34 AM: 'Artificial Intelligence',
4:32:34 AM: 'Hardware',
4:32:34 AM: 'Machine Learning'
4:32:34 AM: ],
4:32:34 AM: starsPresent: true,
4:32:34 AM: starsAsText: '4,404',
4:32:34 AM: marketCapPresent: false,
4:32:34 AM: marketCapAsText: 'N/A',
4:32:34 AM: id: 'accord-net',
4:32:34 AM: flatName: 'Accord.NET',
4:32:34 AM: member: false,
4:32:34 AM: relation: false,
4:32:34 AM: isSubsidiaryProject: false
4:32:34 AM: } 2020-11-18T19:53:01Z
4:32:34 AM: [
4:32:34 AM: 'Community Data License Agreement (CDLA)',
4:32:34 AM: 'PlaNet',
4:32:34 AM: 'Generic Neural Elastic Search (GNES)',
4:32:34 AM: 'PredictionIO',
4:32:34 AM: 'ELI5',
4:32:34 AM: 'BERT',
4:32:34 AM: 'Nauta',
4:32:34 AM: 'DAWNBench',
4:32:34 AM: 'AresDB',
4:32:34 AM: 'dotmesh',
4:32:34 AM: 'Audit AI',
4:32:34 AM: 'euler',
4:32:34 AM: 'Clipper',
4:32:34 AM: 'Accord.NET',
4:32:34 AM: 'Shogun',
4:32:34 AM: 'DELTA',
4:32:34 AM: 'BeakerX',
4:32:34 AM: 'PixieDust',
4:32:34 AM: 'TreeInterpreter',
4:32:34 AM: 'Cyclone',
4:32:34 AM: 'Lucid',
4:32:34 AM: 'XLM',
4:32:34 AM: 'Chainer RL',
4:32:34 AM: 'ForestFlow',
4:32:34 AM: 'uReplicator',
4:32:34 AM: 'Elastic Deep Learning (EDL)',
4:32:34 AM: 'Kashgari',
4:32:34 AM: 'DataPractices',
4:32:34 AM: 'X-DeepLearning',
4:32:34 AM: 'LIME',
4:32:34 AM: 'Model Asset eXchange (MAX)',
4:32:34 AM: 'TransmogrifAI',
4:32:34 AM: 'OpenBytes',
4:32:34 AM: 'DeepLIFT',
4:32:34 AM: 'Onepanel',
4:32:34 AM: 'DeepSpeech',
4:32:34 AM: 'Lucene',
4:32:34 AM: 'Turi Create',
4:32:34 AM: 'Visual Object Tagging Tool (VoTT)',
4:32:34 AM: 'Acumos',
4:32:34 AM: 'Skater',
4:32:34 AM: 'Catalyst',
4:32:34 AM: 'SKIP Language',
4:32:34 AM: 'SQLFlow',
4:32:34 AM: 'Advertorch',
4:32:34 AM: 'xLearn',
4:32:34 AM: 'Neuropod',
4:32:34 AM: 'AdvBox',
4:32:34 AM: 'RCloud',
4:32:34 AM: 'Neo-AI',
4:32:34 AM: 'Embedded Learning Library',
4:32:34 AM: 'Stable Baselines',
4:32:34 AM: 'talos',
4:32:34 AM: 'LabelImg',
4:32:34 AM: 'MMdnn',
4:32:34 AM: 'CNTK',
4:32:34 AM: 'Machine Learning eXchange',
4:32:34 AM: 'Singularity',
4:32:34 AM: 'Chainer',
4:32:34 AM: 'PyText',
4:32:34 AM: 'Pipeline.ai',
4:32:34 AM: 'Apache Bahir',
4:32:34 AM: 'NLP Architect',
4:32:34 AM: 'AllenNLP',
4:32:34 AM: 'Angel-ML',
4:32:34 AM: 'SEED RL',
4:32:34 AM: 'Coach',
4:32:34 AM: 'Gluon-NLP',
4:32:34 AM: 'DeepMind Lab',
4:32:34 AM: 'SEAL',
4:32:34 AM: 'MXNet',
4:32:34 AM: 'OpenAI Gym',
4:32:34 AM: 'MindMeld',
4:32:34 AM: 'CleverHans',
4:32:34 AM: 'Petastorm',
4:32:34 AM: 'Hawq',
4:32:34 AM: 'TF Encrypted',
4:32:34 AM: 'faust',
4:32:34 AM: 'Cortex',
4:32:34 AM: 'OpenDataology',
4:32:34 AM: 'YouTokenToMe',
4:32:34 AM: 'ALBERT',
4:32:34 AM: 'Adlik',
4:32:34 AM: '1chipML',
4:32:34 AM: 'Neural Network Distiller',
4:32:34 AM: 'Labelbox',
4:32:34 AM: 'Facets',
4:32:34 AM: 'OpenNN',
4:32:34 AM: 'Pilosa',
4:32:34 AM: 'Orchest',
4:32:34 AM: 'Model Server for Apache MXNet',
4:32:34 AM: 'LASER',
4:32:34 AM: 'Dopamine',
4:32:34 AM: 'MindSpore',
4:32:34 AM: 'HE Lib',
4:32:34 AM: 'd6tflow',
4:32:34 AM: 'Sonnet',
4:32:34 AM: 'Plaid ML',
4:32:34 AM: 'Nyoka',
4:32:34 AM: 'doccano',
4:32:34 AM: ... 253 more items
4:32:34 AM: ]
4:32:34 AM: ncc: Version 0.34.0
4:32:34 AM: ncc: Compiling file index.js into CJS
4:32:34 AM: ncc: Version 0.34.0
4:32:34 AM: ncc: Compiling file index.js into CJS
4:32:34 AM: ncc: Version 0.34.0
4:32:34 AM: ncc: Compiling file index.js into CJS
4:32:34 AM: Development server running at http://127.0.0.1:4000/
4:32:34 AM: Running integration-test,check-landscape,render-landscape,funding in parallel
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=accord-net-framework&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=microsoft&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
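Note: every "api request starting..." line above hits the same /api/ids endpoint with a fixed set of filter parameters plus a format selector (main, card, members, hosting). The parameter names are taken directly from the logged URLs; the small fetch wrapper below is only an illustration of assembling such a query (it assumes Node 18's global fetch and a JSON response).

    // Hedged sketch: build and issue an /api/ids query like the ones logged above.
    async function fetchIds(baseUrl, overrides = {}) {
      const params = {
        category: '', project: '', license: '', organization: '',
        headquarters: '', 'company-type': '', industries: '',
        sort: 'name', grouping: 'no', bestpractices: '', enduser: '',
        parent: '', language: '', specification: '', format: 'main',
        ...overrides,
      };
      const query = new URLSearchParams(params).toString();
      const response = await fetch(`${baseUrl}/api/ids?${query}`); // Node 18+ global fetch
      return response.json(); // assumes a JSON payload
    }

    // Example mirroring one of the logged requests:
    // fetchIds('http://127.0.0.1:4000', { organization: 'accord-net-framework', grouping: 'project', format: 'card' });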
4:32:34 AM: Task: integration-test PASS specs/main.spec.js (9.897s)
4:32:34 AM: Main test
4:32:34 AM: I visit a main page and have all required elements
4:32:34 AM: ✓ I can open a page (1609ms)
4:32:34 AM: ✓ A proper header is present (5ms)
4:32:34 AM: ✓ Group headers are ok (2ms)
4:32:34 AM: ✓ I see a You are viewing text (1ms)
4:32:34 AM: ✓ A proper card is present (2ms)
4:32:34 AM: ✓ If I click on a card, I see a modal dialog (328ms)
4:32:34 AM: ✓ Closing a browser (29ms)
4:32:34 AM: Landscape Test
4:32:34 AM: ✓ I visit LF AI & Data Members landscape page and have all required elements, elements are clickable (1208ms)
4:32:34 AM: ✓ Closing a browser (22ms)
4:32:34 AM: ✓ I visit Companies Hosting Projects landscape page and have all required elements, elements are clickable (659ms)
4:32:34 AM: ✓ Closing a browser (20ms)
4:32:34 AM: I visit a main landscape page and have all required elements
4:32:34 AM: ✓ I open a landscape page and wait for it to load (1960ms)
4:32:34 AM: ✓ When I click on an item the modal is open (164ms)
4:32:34 AM: ✓ If I would straight open the url with a selected id, a modal appears (1950ms)
4:32:34 AM: ✓ Closing a browser (35ms)
4:32:34 AM: Filtering by organization
4:32:34 AM: ✓ Checking we see Accord.NET when filtering by organization Accord.NET Framework (595ms)
4:32:34 AM: ✓ Checking we don't see Accord.NET when filtering by organization Microsoft (226ms)
4:32:34 AM: ✓ Closing a browser (24ms)
4:32:34 AM: PASS specs/tools/actualTwitter.spec.js
4:32:34 AM: Twitter URL
4:32:34 AM: when crunchbase data not set
4:32:34 AM: ✓ returns URL from node (2ms)
4:32:34 AM: when node does not have twitter URL
4:32:34 AM: ✓ returns URL from node
4:32:34 AM: when node has twitter URL set to null
4:32:34 AM: ✓ returns undefined
4:32:34 AM: when both node and crunchbase have twitter URL
4:32:34 AM: ✓ returns URL from node
4:32:34 AM: when twitter URL is not set anywhere
4:32:34 AM: ✓ returns undefined (1ms)
4:32:34 AM: cleaning up twitter URL
4:32:34 AM: ✓ replaces http with https
4:32:34 AM: ✓ removes www
4:32:34 AM: ✓ query string
4:32:34 AM: Test Suites: 2 passed, 2 total
4:32:34 AM: Tests: 26 passed, 26 total
4:32:34 AM: Snapshots: 0 total
4:32:34 AM: Time: 10.131s
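Note: the actualTwitter.spec.js output above spells out the resolution rules (a node's own twitter value takes precedence over crunchbase data; an explicit null yields undefined) and the clean-up rules (http is upgraded to https, www. is stripped, the query string is dropped). The sketch below encodes roughly that behavior; it is not the project's actual module, and the field names are assumptions.

    // Hedged sketch of the behavior suggested by the actualTwitter.spec.js test names.
    // node.twitter and crunchbaseData.twitter are assumed field names.
    function actualTwitter(node = {}, crunchbaseData = {}) {
      const raw = 'twitter' in node ? node.twitter : crunchbaseData.twitter;
      if (!raw) return undefined;            // covers null and missing everywhere
      return raw
        .replace(/^http:\/\//, 'https://')   // replaces http with https
        .replace('://www.', '://')           // removes www
        .split('?')[0];                      // drops the query string
    }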
4:32:34 AM: Task: check-landscape
4:32:34 AM: Task: render-landscape visiting http://localhost:4000/fullscreen?version=2025-02-07T04:31:29Z d651106&scale=false&pdf
4:32:34 AM: visiting http://localhost:4000/fullscreen?version=2025-02-07T04:31:29Z d651106&scale=false&pdf
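Note: the render-landscape task visits the /fullscreen page with &pdf in the query string to produce a printable rendering. Since the Link step shows Puppeteer being installed, a plausible (but unverified) sketch of rendering that page to a PDF is shown below; the viewport, output path, and launch flags are assumptions.

    // Hedged sketch: render the fullscreen landscape page to a PDF with Puppeteer.
    // The URL shape comes from the log; everything else is illustrative.
    const puppeteer = require('puppeteer');

    async function renderLandscapePdf(url, outputPath) {
      const browser = await puppeteer.launch({ args: ['--no-sandbox'] });
      const page = await browser.newPage();
      await page.setViewport({ width: 1920, height: 1080 }); // assumed size
      await page.goto(url, { waitUntil: 'networkidle0' });
      await page.pdf({ path: outputPath, printBackground: true });
      await browser.close();
    }

    // renderLandscapePdf('http://localhost:4000/fullscreen?version=...&scale=false&pdf', 'landscape.pdf');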
4:32:34 AM: Task: funding From https://github.com/lfai/lfai-landscape
4:32:34 AM: * [new branch] NSouthernLF-patch-1 -> github/NSouthernLF-patch-1
4:32:34 AM: * [new branch] NSouthernLF-patch-2 -> github/NSouthernLF-patch-2
4:32:34 AM: * [new branch] NSouthernLF-patch-3 -> github/NSouthernLF-patch-3
4:32:34 AM: * [new branch] NSouthernLF-patch-3-1 -> github/NSouthernLF-patch-3-1
4:32:34 AM: * [new branch] create-pull-request/patch-1693628094 -> github/create-pull-request/patch-1693628094
4:32:34 AM: * [new branch] create-pull-request/patch-1693714471 -> github/create-pull-request/patch-1693714471
4:32:34 AM: * [new branch] create-pull-request/patch-1703668428 -> github/create-pull-request/patch-1703668428
4:32:34 AM: * [new branch] main -> github/main
4:32:34 AM: * [new branch] revert-303-main -> github/revert-303-main
4:32:34 AM: * [new branch] web-landscape-2021-10-01T22-18 -> github/web-landscape-2021-10-01T22-18
4:32:34 AM: * [new branch] web-landscape-2021-10-12T16-06 -> github/web-landscape-2021-10-12T16-06
4:32:34 AM: * [new branch] web-landscape-2021-10-18T19-46 -> github/web-landscape-2021-10-18T19-46
4:32:34 AM: * [new branch] web-landscape-2021-10-20T18-58 -> github/web-landscape-2021-10-20T18-58
4:32:38 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:32:39 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:32:39 AM: * Documentation: https://help.ubuntu.com
4:32:39 AM: * Management: https://landscape.canonical.com
4:32:39 AM: * Support: https://ubuntu.com/advantage
4:32:39 AM: System information as of Fri Feb 7 04:32:38 UTC 2025
4:32:39 AM: System load: 1.82
4:32:39 AM: Usage of /: 71.9% of 217.51GB
4:32:39 AM: Memory usage: 18%
4:32:39 AM: Swap usage: 1%
4:32:39 AM: Processes: 623
4:32:39 AM: Users logged in: 1
4:32:39 AM: IPv4 address for bond0: 147.75.199.15
4:32:39 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:32:39 AM: IPv4 address for docker0: 172.17.0.1
4:32:39 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:32:39 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:32:39 AM: 85 updates can be applied immediately.
4:32:39 AM: 10 of these updates are standard security updates.
4:32:39 AM: To see these additional updates run: apt list --upgradable
4:32:39 AM: New release '22.04.5 LTS' available.
4:32:39 AM: Run 'do-release-upgrade' to upgrade to it.
4:32:39 AM: 2 updates could not be installed automatically. For more details,
4:32:39 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:32:39 AM: *** System restart required ***
4:32:39 AM: Remote build done!
4:32:39 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:32:39 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:32:39 AM: * Documentation: https://help.ubuntu.com
4:32:39 AM: * Management: https://landscape.canonical.com
4:32:39 AM: * Support: https://ubuntu.com/advantage
4:32:39 AM: System information as of Fri Feb 7 04:30:13 UTC 2025
4:32:39 AM: System load: 0.58
4:32:39 AM: Usage of /: 71.4% of 217.51GB
4:32:39 AM: Memory usage: 18%
4:32:39 AM: Swap usage: 1%
4:32:39 AM: Processes: 644
4:32:39 AM: Users logged in: 1
4:32:39 AM: IPv4 address for bond0: 147.75.199.15
4:32:39 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:32:39 AM: IPv4 address for docker0: 172.17.0.1
4:32:39 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:32:39 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:32:39 AM: 85 updates can be applied immediately.
4:32:39 AM: 10 of these updates are standard security updates.
4:32:39 AM: To see these additional updates run: apt list --upgradable
4:32:39 AM: New release '22.04.5 LTS' available.
4:32:39 AM: Run 'do-release-upgrade' to upgrade to it.
4:32:39 AM: 2 updates could not be installed automatically. For more details,
4:32:39 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:32:39 AM: *** System restart required ***
4:32:39 AM: /opt/buildhome/.nvm/nvm.sh
4:32:39 AM: .:
4:32:39 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:32:39 AM: bin landscapes_dev package.json update_server
4:32:39 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:32:39 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:32:39 AM: files LICENSE server.js
4:32:39 AM: _headers netlify specs
4:32:39 AM: INSTALL.md netlify.md src
4:32:39 AM: v18.3
4:32:39 AM: Downloading and installing node v18.3.0...
4:32:39 AM: Computing checksum with sha256sum
4:32:39 AM: Checksums matched!
4:32:39 AM: Now using node v18.3.0 (npm v8.11.0)
4:32:39 AM: Now using node v18.3.0 (npm v8.11.0)
4:32:39 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:32:39 AM:
4:32:39 AM: added 3 packages, and audited 4 packages in 520ms
4:32:39 AM: found 0 vulnerabilities
4:32:39 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:32:39 AM:
4:32:39 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:32:39 AM: 27 packages are looking for funding
4:32:39 AM: run `npm fund` for details
4:32:39 AM: found 0 vulnerabilities
4:32:39 AM: added 1 package in 1s
4:32:39 AM: YN0000: ┌ Resolution step
4:32:39 AM: YN0000: └ Completed
4:32:39 AM: YN0000: ┌ Fetch step
4:32:39 AM: YN0013: │ 2 packages were already cached, 808 had to be fetched
4:32:39 AM: YN0000: └ Completed in 5s 617ms
4:32:39 AM: YN0000: ┌ Link step
4:32:39 AM: YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:32:39 AM: YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:32:39 AM: YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:32:39 AM: YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:32:39 AM: YN0000: │ puppeteer@npm:14.2.1 STDERR
4:32:39 AM: YN0000: │ puppeteer@npm:13.2.0 STDERR
4:32:39 AM: YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:32:39 AM: YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:32:39 AM: YN0000: └ Completed in 6s 657ms
4:32:39 AM: YN0000: Done with warnings in 12s 692ms
4:32:39 AM: Processing the tree
4:32:39 AM: Running with a level=easy. Settings:
4:32:39 AM: Use cached crunchbase data: true
4:32:39 AM: Use cached images data: true
4:32:39 AM: Use cached twitter data: true
4:32:39 AM: Use cached github basic stats: true
4:32:39 AM: Use cached github start dates: true
4:32:39 AM: Use cached best practices: true
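Note: the settings block above shows that a run at level=easy reuses every cached data source instead of re-fetching it. The mapping below is only a guess at how such a level could translate into cache flags; the level names other than "easy" and the flag names are invented for illustration.

    // Hedged sketch: map a run level to the cache flags listed in the log.
    const levels = {
      easy: {
        useCachedCrunchbase: true,
        useCachedImages: true,
        useCachedTwitter: true,
        useCachedGithubBasicStats: true,
        useCachedGithubStartDates: true,
        useCachedBestPractices: true,
      },
      // a stricter (hypothetical) level would presumably re-fetch everything
      complete: {
        useCachedCrunchbase: false,
        useCachedImages: false,
        useCachedTwitter: false,
        useCachedGithubBasicStats: false,
        useCachedGithubStartDates: false,
        useCachedBestPractices: false,
      },
    };

    function settingsFor(level = 'easy') {
      return levels[level] || levels.easy;
    }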
4:32:39 AM: Fetching crunchbase entries
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: .......................................................**
4:32:39 AM: Fetching github entries
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ..................................*********************.........................
4:32:39 AM: ................................................................................
4:32:39 AM: ....................................................................*********
4:32:39 AM: Fetching start date entries
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ............................................***********.........................
4:32:39 AM: ................................................................................
4:32:39 AM: ..........................................................*******************
4:32:39 AM: Fetching images
4:32:39 AM: got image entries
4:32:39 AM: Hash for Prefect is prefect-2
4:32:39 AM: ................................................................................
4:32:39 AM: .....**.......**................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ...
4:32:39 AM: Fetching last tweet dates
4:32:39 AM: Fetching best practices
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................................................
4:32:39 AM: ................................................
4:32:39 AM: Fetching CLOMonitor data
4:32:39 AM: Processing the tree
4:32:39 AM: saving!
4:32:39 AM: Hash for Prefect is prefect-2
4:32:39 AM: Warning: Open Platform for Enterprise AI (OPEA) has a twitter https://twitter.com/opeadev which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Dragonfly has a twitter https://twitter.com/dragonfly_oss which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Vald has a twitter https://twitter.com/vdaas_vald which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: FastTrackML has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Mathesar has a twitter https://twitter.com/centerofci which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: RWKV has a twitter https://twitter.com/RWKV_AI which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Langchain has a twitter https://twitter.com/LangChainAI which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Haystack has a twitter https://twitter.com/deepset_ai which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Armada has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: envd has a twitter https://twitter.com/TensorChord which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: NVIDIA Corporation has a twitter https://twitter.com/nvidia which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Advanced Micro Devices (AMD) has a twitter https://twitter.com/AMD which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Alibaba Cloud (Singapore) Private LTD has a twitter https://twitter.com/AlibabaB2B which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Fujitsu Limited has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: GitCode has a twitter https://twitter.com/turingbook which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Neo4j, Inc. has a twitter https://twitter.com/neo4j which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: New Relic, Inc. has a twitter https://twitter.com/newrelic which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Posit has a twitter https://twitter.com/posit_pbc which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Snyk Limited has a twitter https://twitter.com/snyksec which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: AI for People has a twitter https://twitter.com/AIforPeople which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Ambianic.ai has a twitter https://twitter.com/ambianicai which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Columbia University has a twitter https://twitter.com/Columbia which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Common Crawl Foundation has a twitter https://twitter.com/commoncrawl which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Ersilia Open Source Initiative has a twitter https://twitter.com/ersiliaio which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Ewha Institute for Biomedical Law & Ethics has a twitter https://twitter.com/cybercampus which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Montreal AI Ethics Institute has a twitter https://twitter.com/mtlaiethics which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: NIPA has a twitter https://twitter.com/NIPAkr which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: One Fact Foundation has a twitter https://twitter.com/onefact_org which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Pranveer Singh Institute Of Technology has a twitter https://twitter.com/PSITKanpur2004 which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Rensselaer Polytechnic Institute has a twitter https://twitter.com/rpi which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Transport for NSW has a twitter https://twitter.com/TransportforNSW which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: University of Michigan has a twitter https://twitter.com/UMich which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Wikimedia Deutschland e. V. has a twitter https://twitter.com/WikimediaDE which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: World Ethical Data Foundation has a twitter https://twitter.com/WEDF_foundation which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: BasicAI(hosting) has a twitter https://twitter.com/BasicAIteam which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Fujitsu Limited(hosting) has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:32:39 AM: Warning: Owkin(hosting) has a twitter https://twitter.com/owkinscience which is invalid or we just can not fetch its tweets
4:32:39 AM: Fetching members from LF AI & Data Member Company category
4:32:39 AM: Processing the tree
4:32:39 AM: Assigning Premier membership on DataHub (LinkedIn) because its parent Microsoft has Premier membership
4:32:39 AM: Assigning Premier membership on Brooklin (LinkedIn) because its parent Microsoft has Premier membership
4:32:39 AM: Assigning Premier membership on Azkaban (LinkedIn) because its parent Microsoft has Premier membership
4:32:39 AM: Assigning Premier membership on LinkedIn (hosting) (LinkedIn) because its parent Microsoft has Premier membership
4:32:39 AM: Hash for Chaitanya Bharathi Institute of Technology is chaitanya-bharathi-institute-of-technology-2
4:32:39 AM: Hash for Fast.ai is fast-ai-2
4:32:39 AM: Hash for Great Expectations is great-expectations-2
4:32:39 AM: Hash for ML Perf is ml-perf-2
4:32:39 AM: Hash for PipelineAI is pipeline-ai-2
4:32:39 AM: Hash for Prefect is prefect-2
4:32:39 AM: Hash for Redash is redash-2
4:32:39 AM: Hash for Wikimedia Deutschland e.V. is wikimedia-deutschland-e-v-2
4:32:39 AM: {
4:32:39 AM: name: 'Accord.NET',
4:32:39 AM: homepage_url: 'http://accord-framework.net/',
4:32:39 AM: logo: 'accord-net.svg',
4:32:39 AM: github_data: {
4:32:39 AM: languages: [
4:32:39 AM: [Object], [Object],
4:32:39 AM: [Object], [Object],
4:32:39 AM: [Object], [Object],
4:32:39 AM: [Object], [Object],
4:32:39 AM: [Object], [Object],
4:32:39 AM: [Object], [Object],
4:32:39 AM: [Object], [Object],
4:32:39 AM: [Object]
4:32:39 AM: ],
4:32:39 AM: contributions: '0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0',
4:32:39 AM: firstWeek: '2022-11-27Z',
4:32:39 AM: stars: 4404,
4:32:39 AM: license: 'GNU Lesser General Public License v2.1',
4:32:39 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:32:39 AM: latest_commit_date: '2020-11-18T19:53:01Z',
4:32:39 AM: latest_commit_link: '/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:32:39 AM: release_date: '2017-10-19T21:00:56Z',
4:32:39 AM: contributors_count: 98,
4:32:39 AM: },
4:32:39 AM: repos: [ { url: 'https://github.com/accord-net/framework', stars: 4404 } ],
4:32:39 AM: github_start_commit_data: {
4:32:39 AM: start_commit_link: '/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:32:39 AM: start_date: '2012-04-08T14:05:58Z'
4:32:39 AM: },
4:32:39 AM: image_data: {
4:32:39 AM: fileName: 'accord-net.svg',
4:32:39 AM: hash: 'tdvFWfo6hfZQqHqjscAtcoZWk6qztZFzBTVqKWUjYig='
4:32:39 AM: },
4:32:39 AM: firstCommitDate: '2012-04-08T14:05:58Z',
4:32:39 AM: latestCommitDate: '2020-11-18T19:53:01Z',
4:32:39 AM: releaseDate: '2017-10-19T21:00:56Z',
4:32:39 AM: commitsThisYear: 0,
4:32:39 AM: contributorsCount: 98,
4:32:39 AM: language: 'C#',
4:32:39 AM: stars: 4404,
4:32:39 AM: license: 'GNU Lesser General Public License v2.1',
4:32:39 AM: headquarters: 'Grenoble, France',
4:32:39 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:32:39 AM: organization: 'Accord.NET Framework',
4:32:39 AM: crunchbaseData: {
4:32:39 AM: name: 'Accord.NET Framework',
4:32:39 AM: description: 'Machine Learning Framework',
4:32:39 AM: homepage: 'http://accord-framework.net/',
4:32:39 AM: city: 'Grenoble',
4:32:39 AM: region: 'Rhone-Alpes',
4:32:39 AM: country: 'France',
4:32:39 AM: twitter: null,
4:32:39 AM: linkedin: null,
4:32:39 AM: acquisitions: [],
4:32:39 AM: parents: [],
4:32:39 AM: stockExchange: null,
4:32:39 AM: company_type: 'Non Profit',
4:32:39 AM: industries: [
4:32:39 AM: 'Analytics',
4:32:39 AM: 'Artificial Intelligence',
4:32:39 AM: 'Hardware',
4:32:39 AM: 'Machine Learning'
4:32:39 AM: ],
4:32:39 AM: numEmployeesMin: null,
4:32:39 AM: numEmployeesMax: null
4:32:39 AM: },
4:32:39 AM: path: 'Machine Learning / Framework',
4:32:39 AM: landscape: 'Machine Learning / Framework',
4:32:39 AM: category: 'Machine Learning',
4:32:39 AM: amount: 'N/A',
4:32:39 AM: oss: true,
4:32:39 AM: href: 'logos/accord-net.svg',
4:32:39 AM: bestPracticeBadgeId: false,
4:32:39 AM: bestPracticePercentage: null,
4:32:39 AM: industries: [
4:32:39 AM: 'Analytics',
4:32:39 AM: 'Artificial Intelligence',
4:32:39 AM: 'Hardware',
4:32:39 AM: 'Machine Learning'
4:32:39 AM: ],
4:32:39 AM: starsPresent: true,
4:32:39 AM: starsAsText: '4,404',
4:32:39 AM: marketCapPresent: false,
4:32:39 AM: marketCapAsText: 'N/A',
4:32:39 AM: id: 'accord-net',
4:32:39 AM: flatName: 'Accord.NET',
4:32:39 AM: member: false,
4:32:39 AM: relation: false,
4:32:39 AM: isSubsidiaryProject: false
4:32:39 AM: } 2020-11-18T19:53:01Z
4:32:39 AM: [
4:32:39 AM: 'Community Data License Agreement (CDLA)',
4:32:39 AM: 'PlaNet',
4:32:39 AM: 'Generic Neural Elastic Search (GNES)',
4:32:39 AM: 'PredictionIO',
4:32:39 AM: 'ELI5',
4:32:39 AM: 'BERT',
4:32:39 AM: 'Nauta',
4:32:39 AM: 'DAWNBench',
4:32:39 AM: 'AresDB',
4:32:39 AM: 'dotmesh',
4:32:39 AM: 'Audit AI',
4:32:39 AM: 'euler',
4:32:39 AM: 'Clipper',
4:32:39 AM: 'Accord.NET',
4:32:39 AM: 'Shogun',
4:32:39 AM: 'DELTA',
4:32:39 AM: 'BeakerX',
4:32:39 AM: 'PixieDust',
4:32:39 AM: 'TreeInterpreter',
4:32:39 AM: 'Cyclone',
4:32:39 AM: 'Lucid',
4:32:39 AM: 'XLM',
4:32:39 AM: 'Chainer RL',
4:32:39 AM: 'ForestFlow',
4:32:39 AM: 'uReplicator',
4:32:39 AM: 'Elastic Deep Learning (EDL)',
4:32:39 AM: 'Kashgari',
4:32:39 AM: 'DataPractices',
4:32:39 AM: 'X-DeepLearning',
4:32:39 AM: 'LIME',
4:32:39 AM: 'Model Asset eXchange (MAX)',
4:32:39 AM: 'TransmogrifAI',
4:32:39 AM: 'OpenBytes',
4:32:39 AM: 'DeepLIFT',
4:32:39 AM: 'Onepanel',
4:32:39 AM: 'DeepSpeech',
4:32:39 AM: 'Lucene',
4:32:39 AM: 'Turi Create',
4:32:39 AM: 'Visual Object Tagging Tool (VoTT)',
4:32:39 AM: 'Acumos',
4:32:39 AM: 'Skater',
4:32:39 AM: 'Catalyst',
4:32:39 AM: 'SKIP Language',
4:32:39 AM: 'SQLFlow',
4:32:39 AM: 'Advertorch',
4:32:39 AM: 'xLearn',
4:32:39 AM: 'Neuropod',
4:32:39 AM: 'AdvBox',
4:32:39 AM: 'RCloud',
4:32:39 AM: 'Neo-AI',
4:32:39 AM: 'Embedded Learning Library',
4:32:39 AM: 'Stable Baselines',
4:32:39 AM: 'talos',
4:32:39 AM: 'LabelImg',
4:32:39 AM: 'MMdnn',
4:32:39 AM: 'CNTK',
4:32:39 AM: 'Machine Learning eXchange',
4:32:39 AM: 'Singularity',
4:32:39 AM: 'Chainer',
4:32:39 AM: 'PyText',
4:32:39 AM: 'Pipeline.ai',
4:32:39 AM: 'Apache Bahir',
4:32:39 AM: 'NLP Architect',
4:32:39 AM: 'AllenNLP',
4:32:39 AM: 'Angel-ML',
4:32:39 AM: 'SEED RL',
4:32:39 AM: 'Coach',
4:32:39 AM: 'Gluon-NLP',
4:32:39 AM: 'DeepMind Lab',
4:32:39 AM: 'SEAL',
4:32:39 AM: 'MXNet',
4:32:39 AM: 'OpenAI Gym',
4:32:39 AM: 'MindMeld',
4:32:39 AM: 'CleverHans',
4:32:39 AM: 'Petastorm',
4:32:39 AM: 'Hawq',
4:32:39 AM: 'TF Encrypted',
4:32:39 AM: 'faust',
4:32:39 AM: 'Cortex',
4:32:39 AM: 'OpenDataology',
4:32:39 AM: 'YouTokenToMe',
4:32:39 AM: 'ALBERT',
4:32:39 AM: 'Adlik',
4:32:39 AM: '1chipML',
4:32:39 AM: 'Neural Network Distiller',
4:32:39 AM: 'Labelbox',
4:32:39 AM: 'Facets',
4:32:39 AM: 'OpenNN',
4:32:39 AM: 'Pilosa',
4:32:39 AM: 'Orchest',
4:32:39 AM: 'Model Server for Apache MXNet',
4:32:39 AM: 'LASER',
4:32:39 AM: 'Dopamine',
4:32:39 AM: 'MindSpore',
4:32:39 AM: 'HE Lib',
4:32:39 AM: 'd6tflow',
4:32:39 AM: 'Sonnet',
4:32:39 AM: 'Plaid ML',
4:32:39 AM: 'Nyoka',
4:32:39 AM: 'doccano',
4:32:39 AM: ... 253 more items
4:32:39 AM: ]
4:32:39 AM: ncc: Version 0.34.0
4:32:39 AM: ncc: Compiling file index.js into CJS
4:32:39 AM: ncc: Version 0.34.0
4:32:39 AM: ncc: Compiling file index.js into CJS
4:32:39 AM: ncc: Version 0.34.0
4:32:39 AM: ncc: Compiling file index.js into CJS
4:32:39 AM: Development server running at http://127.0.0.1:4000/
4:32:39 AM: Running integration-test,check-landscape,render-landscape,funding in parallel
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=accord-net-framework&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=microsoft&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:32:39 AM: Task: integration-test PASS specs/main.spec.js (9.897s)
4:32:39 AM: Main test
4:32:39 AM: I visit a main page and have all required elements
4:32:39 AM: ✓ I can open a page (1609ms)
4:32:39 AM: ✓ A proper header is present (5ms)
4:32:39 AM: ✓ Group headers are ok (2ms)
4:32:39 AM: ✓ I see a You are viewing text (1ms)
4:32:39 AM: ✓ A proper card is present (2ms)
4:32:39 AM: ✓ If I click on a card, I see a modal dialog (328ms)
4:32:39 AM: ✓ Closing a browser (29ms)
4:32:39 AM: Landscape Test
4:32:39 AM: ✓ I visit LF AI & Data Members landscape page and have all required elements, elements are clickable (1208ms)
4:32:39 AM: ✓ Closing a browser (22ms)
4:32:39 AM: ✓ I visit Companies Hosting Projects landscape page and have all required elements, elements are clickable (659ms)
4:32:39 AM: ✓ Closing a browser (20ms)
4:32:39 AM: I visit a main landscape page and have all required elements
4:32:39 AM: ✓ I open a landscape page and wait for it to load (1960ms)
4:32:39 AM: ✓ When I click on an item the modal is open (164ms)
4:32:39 AM: ✓ If I would straight open the url with a selected id, a modal appears (1950ms)
4:32:39 AM: ✓ Closing a browser (35ms)
4:32:39 AM: Filtering by organization
4:32:39 AM: ✓ Checking we see Accord.NET when filtering by organization Accord.NET Framework (595ms)
4:32:39 AM: ✓ Checking we don't see Accord.NET when filtering by organization Microsoft (226ms)
4:32:39 AM: ✓ Closing a browser (24ms)
4:32:39 AM: PASS specs/tools/actualTwitter.spec.js
4:32:39 AM: Twitter URL
4:32:39 AM: when crunchbase data not set
4:32:39 AM: ✓ returns URL from node (2ms)
4:32:39 AM: when node does not have twitter URL
4:32:39 AM: ✓ returns URL from node
4:32:39 AM: when node has twitter URL set to null
4:32:39 AM: ✓ returns undefined
4:32:39 AM: when both node and crunchbase have twitter URL
4:32:39 AM: ✓ returns URL from node
4:32:39 AM: when twitter URL is not set anywhere
4:32:39 AM: ✓ returns undefined (1ms)
4:32:39 AM: cleaning up twitter URL
4:32:39 AM: ✓ replaces http with https
4:32:39 AM: ✓ removes www
4:32:39 AM: ✓ query string
4:32:39 AM: Test Suites: 2 passed, 2 total
4:32:39 AM: Tests: 26 passed, 26 total
4:32:39 AM: Snapshots: 0 total
4:32:39 AM: Time: 10.131s
4:32:39 AM: Task: check-landscape
4:32:39 AM: Task: render-landscape visiting http://localhost:4000/fullscreen?version=2025-02-07T04:31:29Z d651106&scale=false&pdf
4:32:39 AM: visiting http://localhost:4000/fullscreen?version=2025-02-07T04:31:29Z d651106&scale=false&pdf
4:32:39 AM: Task: funding From https://github.com/lfai/lfai-landscape
4:32:39 AM: * [new branch] NSouthernLF-patch-1 -> github/NSouthernLF-patch-1
4:32:39 AM: * [new branch] NSouthernLF-patch-2 -> github/NSouthernLF-patch-2
4:32:39 AM: * [new branch] NSouthernLF-patch-3 -> github/NSouthernLF-patch-3
4:32:39 AM: * [new branch] NSouthernLF-patch-3-1 -> github/NSouthernLF-patch-3-1
4:32:39 AM: * [new branch] create-pull-request/patch-1693628094 -> github/create-pull-request/patch-1693628094
4:32:39 AM: * [new branch] create-pull-request/patch-1693714471 -> github/create-pull-request/patch-1693714471
4:32:39 AM: * [new branch] create-pull-request/patch-1703668428 -> github/create-pull-request/patch-1703668428
4:32:39 AM: * [new branch] main -> github/main
4:32:39 AM: * [new branch] revert-303-main -> github/revert-303-main
4:32:39 AM: * [new branch] web-landscape-2021-10-01T22-18 -> github/web-landscape-2021-10-01T22-18
4:32:39 AM: * [new branch] web-landscape-2021-10-12T16-06 -> github/web-landscape-2021-10-12T16-06
4:32:39 AM: * [new branch] web-landscape-2021-10-18T19-46 -> github/web-landscape-2021-10-18T19-46
4:32:39 AM: * [new branch] web-landscape-2021-10-20T18-58 -> github/web-landscape-2021-10-20T18-58
4:32:40 AM: ​
4:32:40 AM: (build.command completed in 2m 33.3s)
4:32:40 AM:
4:32:40 AM: Functions bundling
4:32:40 AM: ────────────────────────────────────────────────────────────────
4:32:40 AM: ​
4:32:40 AM: Packaging Functions from /opt/build/repo/functions directory:
4:32:40 AM: - export.js
4:32:40 AM: - ids.js
4:32:40 AM: - items.js
4:32:40 AM: ​
4:32:44 AM: ​
4:32:44 AM: (Functions bundling completed in 4.2s)
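Note: the three files packaged from the functions directory (export.js, ids.js, items.js) are Netlify Functions. For orientation, the generic handler shape such a function uses is shown below; this is not the actual source of any of those files, and the response body is a placeholder.

    // Hedged sketch: the standard Netlify Function handler shape.
    // The echoed body is illustrative, not what ids.js really returns.
    exports.handler = async (event) => {
      const params = event.queryStringParameters || {};
      return {
        statusCode: 200,
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ received: params }),
      };
    };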
4:32:44 AM:
4:32:54 AM: (Netlify Build completed in 2m 47.4s)
4:32:55 AM: Section completed: building
4:33:11 AM: Finished processing build request in 3m34.233s

Deploying

Complete
4:32:44 AM: Deploy site
4:32:44 AM: ────────────────────────────────────────────────────────────────
4:32:44 AM: ​
4:32:44 AM: Starting to deploy site from 'dist'
4:32:44 AM: Calculating files to upload
4:32:47 AM: 10 new file(s) to upload
4:32:47 AM: 3 new function(s) to upload
4:32:54 AM: Section completed: deploying
4:32:54 AM: Site deploy was successfully initiated
4:32:54 AM: ​
4:32:54 AM: (Deploy site completed in 9.7s)

Cleanup

Complete
4:32:54 AM: Netlify Build Complete
4:32:54 AM: ────────────────────────────────────────────────────────────────
4:32:54 AM: ​
4:32:54 AM: Caching artifacts
4:32:54 AM: Started saving build plugins
4:32:55 AM: Finished saving build plugins
4:32:55 AM: Started saving mise cache
4:32:55 AM: Finished saving mise cache
4:32:55 AM: Started saving pip cache
4:32:55 AM: Finished saving pip cache
4:32:55 AM: Started saving emacs cask dependencies
4:32:55 AM: Finished saving emacs cask dependencies
4:32:55 AM: Started saving maven dependencies
4:32:55 AM: Finished saving maven dependencies
4:32:55 AM: Started saving boot dependencies
4:32:55 AM: Finished saving boot dependencies
4:32:55 AM: Started saving rust rustup cache
4:32:55 AM: Finished saving rust rustup cache
4:32:55 AM: Started saving go dependencies
4:32:55 AM: Finished saving go dependencies
4:32:55 AM: Build script success
4:33:10 AM: Uploading Cache of size 195.7MB
4:33:11 AM: Section completed: cleanup

Post-processing

Complete
4:32:54 AM: Post processing - redirect rules
4:32:54 AM: Starting post processing
4:32:54 AM: Skipping form detection
4:32:54 AM: Post processing - header rules
4:32:54 AM: Post processing done
4:32:54 AM: Section completed: postprocessing
4:32:55 AM: Site is live ✨