Deploy details
Published deploy for lfailandscape
Update Landscape from LFX 2025-02-13 (#839)
Production: main@6a7287a
Deploy log
Initializing
Complete
4:29:28 AM: Build ready to start
4:29:45 AM: build-image version: 9c9fb6952e50bb092d4b66daf2368677e5c68e34 (focal)
4:29:45 AM: buildbot version: 9c9fb6952e50bb092d4b66daf2368677e5c68e34
4:29:45 AM: Fetching cached dependencies
4:29:45 AM: Starting to download cache of 216.2MB
4:29:48 AM: Finished downloading cache in 2.737s
4:29:48 AM: Starting to extract cache
4:29:50 AM: Finished extracting cache in 1.938s
4:29:50 AM: Finished fetching cache in 4.739s
4:29:50 AM: Starting to prepare the repo for build
4:29:50 AM: Preparing Git Reference refs/heads/main
4:29:51 AM: Custom build path detected. Proceeding with the specified path: 'netlify'
4:29:51 AM: Custom functions path detected. Proceeding with the specified path: 'functions'
4:29:51 AM: Custom build command detected. Proceeding with the specified command: '(wget --no-check-certificate --no-cache https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js) && node landscape.js'
4:29:51 AM: Custom ignore command detected. Proceeding with the specified command: 'false'
4:29:52 AM: manpath: warning: $PATH not set
4:29:52 AM: Starting to install dependencies
4:29:52 AM: Started restoring cached mise cache
4:29:53 AM: Finished restoring cached mise cache
4:29:53 AM: mise python@3.13.2 install
4:29:54 AM: mise python@3.13.2 download cpython-3.13.2+20250212-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
4:29:54 AM: mise python@3.13.2 extract cpython-3.13.2+20250212-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
4:29:54 AM: mise python@3.13.2 python --version
4:29:54 AM: mise python@3.13.2 Python 3.13.2
4:29:54 AM: mise python@3.13.2 ✓ installed
4:29:54 AM: Python version set to 3.13
4:29:56 AM: Collecting pipenv
4:29:56 AM: Downloading pipenv-2024.4.1-py3-none-any.whl.metadata (17 kB)
4:29:56 AM: Collecting certifi (from pipenv)
4:29:56 AM: Downloading certifi-2025.1.31-py3-none-any.whl.metadata (2.5 kB)
4:29:56 AM: Collecting packaging>=22 (from pipenv)
4:29:56 AM: Downloading packaging-24.2-py3-none-any.whl.metadata (3.2 kB)
4:29:56 AM: Collecting setuptools>=67 (from pipenv)
4:29:56 AM: Downloading setuptools-75.8.0-py3-none-any.whl.metadata (6.7 kB)
4:29:56 AM: Collecting virtualenv>=20.24.2 (from pipenv)
4:29:56 AM: Downloading virtualenv-20.29.2-py3-none-any.whl.metadata (4.5 kB)
4:29:56 AM: Collecting distlib<1,>=0.3.7 (from virtualenv>=20.24.2->pipenv)
4:29:56 AM: Downloading distlib-0.3.9-py2.py3-none-any.whl.metadata (5.2 kB)
4:29:56 AM: Collecting filelock<4,>=3.12.2 (from virtualenv>=20.24.2->pipenv)
4:29:56 AM: Downloading filelock-3.17.0-py3-none-any.whl.metadata (2.9 kB)
4:29:56 AM: Collecting platformdirs<5,>=3.9.1 (from virtualenv>=20.24.2->pipenv)
4:29:56 AM: Downloading platformdirs-4.3.6-py3-none-any.whl.metadata (11 kB)
4:29:56 AM: Downloading pipenv-2024.4.1-py3-none-any.whl (3.0 MB)
4:29:56 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.0/3.0 MB 55.6 MB/s eta 0:00:00
4:29:57 AM: Downloading packaging-24.2-py3-none-any.whl (65 kB)
4:29:57 AM: Downloading setuptools-75.8.0-py3-none-any.whl (1.2 MB)
4:29:57 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 64.2 MB/s eta 0:00:00
4:29:57 AM: Downloading virtualenv-20.29.2-py3-none-any.whl (4.3 MB)
4:29:57 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.3/4.3 MB 140.0 MB/s eta 0:00:00
4:29:57 AM: Downloading certifi-2025.1.31-py3-none-any.whl (166 kB)
4:29:57 AM: Downloading distlib-0.3.9-py2.py3-none-any.whl (468 kB)
4:29:57 AM: Downloading filelock-3.17.0-py3-none-any.whl (16 kB)
4:29:57 AM: Downloading platformdirs-4.3.6-py3-none-any.whl (18 kB)
4:29:57 AM: Installing collected packages: distlib, setuptools, platformdirs, packaging, filelock, certifi, virtualenv, pipenv
4:29:59 AM: Successfully installed certifi-2025.1.31 distlib-0.3.9 filelock-3.17.0 packaging-24.2 pipenv-2024.4.1 platformdirs-4.3.6 setuptools-75.8.0 virtualenv-20.29.2
4:29:59 AM: [notice] A new release of pip is available: 24.3.1 -> 25.0.1
4:29:59 AM: [notice] To update, run: pip install --upgrade pip
4:29:59 AM: Attempting Ruby version 2.6.2, read from environment
4:29:59 AM: Started restoring cached Ruby version
4:29:59 AM: Finished restoring cached Ruby version
4:30:00 AM: Using Ruby version 2.6.2
4:30:01 AM: Started restoring cached go cache
4:30:01 AM: Finished restoring cached go cache
4:30:01 AM: Installing Go version 1.12 (requested 1.12)
4:30:05 AM: go version go1.12 linux/amd64
4:30:06 AM: Using PHP version 8.0
4:30:08 AM: Started restoring cached Node.js version
4:30:09 AM: Finished restoring cached Node.js version
4:30:09 AM: v14.3.0 is already installed.
4:30:10 AM: Now using node v14.3.0 (npm v6.14.5)
4:30:10 AM: Started restoring cached build plugins
4:30:10 AM: Finished restoring cached build plugins
4:30:10 AM: Successfully installed dependencies
4:30:10 AM: Starting build script
4:30:11 AM: Detected 1 framework(s)
4:30:12 AM: "cecil" at version "unknown"
4:30:12 AM: Section completed: initializing
Building
Complete
4:30:13 AM: Netlify Build
4:30:13 AM: ────────────────────────────────────────────────────────────────
4:30:13 AM:
4:30:13 AM: ❯ Version
4:30:13 AM: @netlify/build 29.58.9
4:30:13 AM:
4:30:13 AM: ❯ Flags
4:30:13 AM: accountId: 5a55185e8198766884f04205
4:30:13 AM: baseRelDir: false
4:30:13 AM: buildId: 67ad752841f88100087c1710
4:30:13 AM: deployId: 67ad752841f88100087c1712
4:30:13 AM:
4:30:13 AM: ❯ Current directory
4:30:13 AM: /opt/build/repo/netlify
4:30:13 AM:
4:30:13 AM: ❯ Config file
4:30:13 AM: /opt/build/repo/netlify.toml
4:30:13 AM:
4:30:13 AM: ❯ Context
4:30:13 AM: production
4:30:13 AM:
4:30:13 AM: build.command from netlify.toml
4:30:13 AM: ────────────────────────────────────────────────────────────────
4:30:13 AM:
4:30:13 AM: $ (wget --no-check-certificate --no-cache https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js) && node landscape.js
4:30:13 AM: --2025-02-13 04:30:13-- https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js
4:30:13 AM: Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.109.133, 185.199.110.133, 185.199.111.133, ...
4:30:13 AM: Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.109.133|:443... connected.
4:30:13 AM: HTTP request sent, awaiting response... 200 OK
4:30:13 AM: Length: 8750 (8.5K) [text/plain]
4:30:13 AM: Saving to: ‘landscape.js’
4:30:13 AM: 0K ........ 100% 144M=0s
4:30:13 AM: 2025-02-13 04:30:13 (144 MB/s) - ‘landscape.js’ saved [8750/8750]
4:30:13 AM: We have a secret: c8***75
4:30:13 AM: We have a secret: 8G***pb
4:30:13 AM: We have a secret: 87***eb
4:30:13 AM: We have a secret: gh***7r
4:30:13 AM: starting /opt/build/repo/netlify
4:30:13 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:30:13 AM: Warning: Permanently added '147.75.199.15' (ECDSA) to the list of known hosts.
4:30:14 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:30:14 AM: * Documentation: https://help.ubuntu.com
4:30:14 AM: * Management: https://landscape.canonical.com
4:30:14 AM: * Support: https://ubuntu.com/advantage
4:30:14 AM: System information as of Thu Feb 13 04:30:14 UTC 2025
4:30:14 AM: System load: 0.45
4:30:14 AM: Usage of /: 75.8% of 217.51GB
4:30:14 AM: Memory usage: 18%
4:30:14 AM: Swap usage: 1%
4:30:14 AM: Processes: 644
4:30:14 AM: Users logged in: 1
4:30:14 AM: IPv4 address for bond0: 147.75.199.15
4:30:14 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:30:14 AM: IPv4 address for docker0: 172.17.0.1
4:30:14 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:30:14 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:30:14 AM: 82 updates can be applied immediately.
4:30:14 AM: 7 of these updates are standard security updates.
4:30:14 AM: To see these additional updates run: apt list --upgradable
4:30:14 AM: New release '22.04.5 LTS' available.
4:30:14 AM: Run 'do-release-upgrade' to upgrade to it.
4:30:14 AM: 2 updates could not be installed automatically. For more details,
4:30:14 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:30:14 AM: *** System restart required ***
4:30:14 AM: Cloning into 'packageRemote'...
4:30:14 AM: node version: v18.3
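The "Pseudo-terminal will not be allocated" warnings and the Ubuntu login banner above come from SSH sessions: the downloaded landscape.js hands the heavy build off to a remote host (147.75.199.15), where the repository is cloned into 'packageRemote' and built. A rough sketch of that pattern in Node, assuming a child_process/ssh approach; the user name, paths and exact command are illustrative, not the actual cncf/landscapeapp implementation:

    // Sketch only: run a build command on a remote host over SSH and capture
    // its output, the way the log above suggests landscape.js does.
    const { execSync } = require('child_process');

    function runRemote(command) {
      // -T disables pty allocation, which is what produces the
      // "Pseudo-terminal will not be allocated" lines in this log.
      return execSync(`ssh -T builder@147.75.199.15 '${command}'`, {
        encoding: 'utf8',
        stdio: ['ignore', 'pipe', 'inherit'],
      });
    }

    // 'builder' and the command below are assumptions for illustration.
    const output = runRemote('cd /opt/repo && bash build.sh');
    console.log('Output from remote build, exit code: 0');
    console.log(output);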
4:30:16 AM: focal: Pulling from netlify/build
4:30:16 AM: Digest: sha256:a53361ff11a8e42c6088aa85a401da4ac76b8c9bed96731e2f85536028417ef1
4:30:16 AM: Status: Image is up to date for netlify/build:focal
4:30:16 AM: docker.io/netlify/build:focal
4:30:21 AM: /opt/buildhome/.nvm/nvm.sh
4:30:21 AM: .:
4:30:21 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:30:21 AM: bin landscapes_dev package.json update_server
4:30:21 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:30:21 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:30:21 AM: files LICENSE server.js
4:30:21 AM: _headers netlify specs
4:30:21 AM: INSTALL.md netlify.md src
4:30:22 AM: v18.3
4:30:22 AM: Downloading and installing node v18.3.0...
4:30:22 AM: Downloading https://nodejs.org/dist/v18.3.0/node-v18.3.0-linux-x64.tar.xz...
4:30:22 AM: Computing checksum with sha256sum
4:30:23 AM: Checksums matched!
4:30:25 AM: Now using node v18.3.0 (npm v8.11.0)
4:30:26 AM: Now using node v18.3.0 (npm v8.11.0)
4:30:26 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:30:26 AM:
4:30:27 AM: added 3 packages, and audited 4 packages in 495ms
4:30:27 AM: found 0 vulnerabilities
4:30:27 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:30:27 AM:
4:30:29 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:30:29 AM: 27 packages are looking for funding
4:30:29 AM: run `npm fund` for details
4:30:29 AM: found 0 vulnerabilities
4:30:31 AM: added 1 package in 1s
4:30:31 AM: ➤ YN0000: ┌ Resolution step
4:30:32 AM: ➤ YN0000: └ Completed
4:30:32 AM: ➤ YN0000: ┌ Fetch step
4:30:37 AM: ➤ YN0013: │ 2 packages were already cached, 808 had to be fetched
4:30:37 AM: ➤ YN0000: └ Completed in 5s 619ms
4:30:37 AM: ➤ YN0000: ┌ Link step
4:30:38 AM: ➤ YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:30:38 AM: ➤ YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:30:38 AM: ➤ YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:30:38 AM: ➤ YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:30:40 AM: ➤ YN0000: │ puppeteer@npm:13.2.0 STDERR
4:30:40 AM: ➤ YN0000: │ puppeteer@npm:14.2.1 STDERR
4:30:43 AM: ➤ YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:30:44 AM: ➤ YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:30:44 AM: ➤ YN0000: └ Completed in 6s 556ms
4:30:44 AM: ➤ YN0000: Done with warnings in 12s 581ms
4:30:46 AM: Processing the tree
4:30:49 AM: Running with a level=easy. Settings:
4:30:49 AM: Use cached crunchbase data: true
4:30:49 AM: Use cached images data: true
4:30:49 AM: Use cached twitter data: true
4:30:49 AM: Use cached github basic stats: true
4:30:49 AM: Use cached github start dates: true
4:30:49 AM: Use cached best practices: true
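These flags show a cached, "easy" run: every external data source (Crunchbase, images, Twitter, GitHub stats and start dates, best practices) is read from cache rather than re-fetched. A minimal sketch of how such settings might gate a fetch step, with a guess that the '.' and '*' marks in the progress lines below stand for cache hits and live fetches respectively (names and marker meanings are assumptions, not taken from the tool):

    // Illustrative only; the real settings and fetchers live in cncf/landscapeapp.
    const settings = {
      useCachedCrunchbase: true,
      useCachedImages: true,
      useCachedTwitter: true,
      useCachedGithubBasicStats: true,
      useCachedGithubStartDates: true,
      useCachedBestPractices: true,
    };

    async function fetchCrunchbaseEntries(entries, fetchOne, cache) {
      const results = [];
      for (const entry of entries) {
        if (settings.useCachedCrunchbase && cache.has(entry.crunchbase)) {
          process.stdout.write('.');            // assumed: cached entry
          results.push(cache.get(entry.crunchbase));
        } else {
          process.stdout.write('*');            // assumed: cache miss / live fetch
          results.push(await fetchOne(entry.crunchbase));
        }
      }
      return results;
    }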
4:30:49 AM: Fetching crunchbase entries
4:30:50 AM: ................................................................................
4:30:50 AM: ................................................................................
4:30:50 AM: ................................................................................
4:30:50 AM: ................................................................................
4:30:50 AM: ................................................................................
4:30:50 AM: ....................................................**
4:30:50 AM: Fetching github entries
4:30:57 AM: ................................................................................
4:30:57 AM: ................................................................................
4:30:57 AM: ..................................*********************.........................
4:30:57 AM: ................................................................................
4:30:57 AM: ...................................................................*********
4:30:57 AM: Fetching start date entries
4:31:00 AM: ................................................................................
4:31:00 AM: ................................................................................
4:31:00 AM: ............................................***********.........................
4:31:00 AM: ................................................................................
4:31:00 AM: .........................................................*******************
4:31:00 AM: Fetching images
4:31:00 AM: got image entries
4:31:00 AM: Hash for Prefect is prefect-2
4:31:07 AM: ................................................................................
4:31:07 AM: ....**......**..................................................................
4:31:07 AM: ................................................................................
4:31:07 AM: ................................................................................
4:31:07 AM: ................................................................................
4:31:07 AM: ................................................................................
4:31:07 AM: Fetching last tweet dates
4:31:07 AM: Fetching best practices
4:31:07 AM: ................................................................................
4:31:07 AM: ................................................................................
4:31:07 AM: ................................................................................
4:31:07 AM: ................................................................................
4:31:07 AM: ...............................................
4:31:07 AM: Fetching CLOMonitor data
4:31:07 AM: Processing the tree
4:31:08 AM: saving!
4:31:10 AM: Hash for Prefect is prefect-2
4:31:10 AM: Warning: Open Platform for Enterprise AI (OPEA) has a twitter https://twitter.com/opeadev which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Dragonfly has a twitter https://twitter.com/dragonfly_oss which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Vald has a twitter https://twitter.com/vdaas_vald which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: FastTrackML has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Mathesar has a twitter https://twitter.com/centerofci which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: RWKV has a twitter https://twitter.com/RWKV_AI which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Langchain has a twitter https://twitter.com/LangChainAI which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Haystack has a twitter https://twitter.com/deepset_ai which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Armada has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: envd has a twitter https://twitter.com/TensorChord which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: NVIDIA Corporation has a twitter https://twitter.com/nvidia which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Advanced Micro Devices (AMD) has a twitter https://twitter.com/AMD which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Alibaba Cloud (Singapore) Private LTD has a twitter https://twitter.com/AlibabaB2B which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Fujitsu Limited has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: GitCode has a twitter https://twitter.com/turingbook which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Neo4j, Inc. has a twitter https://twitter.com/neo4j which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Posit has a twitter https://twitter.com/posit_pbc which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Snyk Limited has a twitter https://twitter.com/snyksec which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: AI for People has a twitter https://twitter.com/AIforPeople which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Ambianic.ai has a twitter https://twitter.com/ambianicai which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Columbia University has a twitter https://twitter.com/Columbia which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Common Crawl Foundation has a twitter https://twitter.com/commoncrawl which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Ersilia Open Source Initiative has a twitter https://twitter.com/ersiliaio which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Ewha Institute for Biomedical Law & Ethics has a twitter https://twitter.com/cybercampus which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Montreal AI Ethics Institute has a twitter https://twitter.com/mtlaiethics which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: NIPA has a twitter https://twitter.com/NIPAkr which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: One Fact Foundation has a twitter https://twitter.com/onefact_org which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Pranveer Singh Institute Of Technology has a twitter https://twitter.com/PSITKanpur2004 which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Rensselaer Polytechnic Institute has a twitter https://twitter.com/rpi which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Transport for NSW has a twitter https://twitter.com/TransportforNSW which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: University of Michigan has a twitter https://twitter.com/UMich which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Wikimedia Deutschland e. V. has a twitter https://twitter.com/WikimediaDE which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: World Ethical Data Foundation has a twitter https://twitter.com/WEDF_foundation which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: BasicAI(hosting) has a twitter https://twitter.com/BasicAIteam which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Fujitsu Limited(hosting) has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:31:10 AM: Warning: Owkin(hosting) has a twitter https://twitter.com/owkinscience which is invalid or we just can not fetch its tweets
4:31:11 AM: Fetching members from LF AI & Data Member Company category
4:31:11 AM: Processing the tree
4:31:11 AM: Assigning Premier membership on DataHub (LinkedIn) because its parent Microsoft has Premier membership
4:31:11 AM: Assigning Premier membership on Brooklin (LinkedIn) because its parent Microsoft has Premier membership
4:31:11 AM: Assigning Premier membership on Azkaban (LinkedIn) because its parent Microsoft has Premier membership
4:31:11 AM: Assigning Premier membership on LinkedIn (hosting) (LinkedIn) because its parent Microsoft has Premier membership
4:31:11 AM: Hash for Chaitanya Bharathi Institute of Technology is chaitanya-bharathi-institute-of-technology-2
4:31:11 AM: Hash for Fast.ai is fast-ai-2
4:31:11 AM: Hash for Great Expectations is great-expectations-2
4:31:11 AM: Hash for ML Perf is ml-perf-2
4:31:11 AM: Hash for PipelineAI is pipeline-ai-2
4:31:11 AM: Hash for Prefect is prefect-2
4:31:11 AM: Hash for Redash is redash-2
4:31:11 AM: Hash for Wikimedia Deutschland e.V. is wikimedia-deutschland-e-v-2
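Several entries are reported with a '-2' suffix ("Hash for Fast.ai is fast-ai-2"). One plausible reading is that names are slugified into ids and a numeric suffix is appended when the slug is already in use; the sketch below illustrates that guess and is not the landscapeapp implementation:

    // Hypothetical id scheme behind the "Hash for X is x-2" lines above.
    const taken = new Map();

    function slugify(name) {
      return name.toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/^-+|-+$/g, '');
    }

    function uniqueId(name) {
      const base = slugify(name);                       // "Fast.ai" -> "fast-ai"
      const count = (taken.get(base) || 0) + 1;
      taken.set(base, count);
      return count === 1 ? base : `${base}-${count}`;   // second occurrence -> "fast-ai-2"
    }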
4:31:14 AM: {
4:31:14 AM: name: 'Accord.NET',
4:31:14 AM: homepage_url: 'http://accord-framework.net/',
4:31:14 AM: repo_url: 'https://github.com/accord-net/framework',
4:31:14 AM: logo: 'accord-net.svg',
4:31:14 AM: crunchbase: 'https://www.crunchbase.com/organization/accord-net-framework-project',
4:31:14 AM: github_data: {
4:31:14 AM: languages: [
4:31:14 AM: [Object], [Object],
4:31:14 AM: [Object], [Object],
4:31:14 AM: [Object], [Object],
4:31:14 AM: [Object], [Object],
4:31:14 AM: [Object], [Object],
4:31:14 AM: [Object], [Object],
4:31:14 AM: [Object], [Object],
4:31:14 AM: [Object]
4:31:14 AM: ],
4:31:14 AM: contributions: '0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0',
4:31:14 AM: firstWeek: '2022-11-27Z',
4:31:14 AM: stars: 4404,
4:31:14 AM: license: 'GNU Lesser General Public License v2.1',
4:31:14 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:31:14 AM: latest_commit_date: '2020-11-18T19:53:01Z',
4:31:14 AM: latest_commit_link: '/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:31:14 AM: release_date: '2017-10-19T21:00:56Z',
4:31:14 AM: release_link: 'https://github.com/accord-net/framework/releases',
4:31:14 AM: contributors_count: 98,
4:31:14 AM: contributors_link: 'https://github.com/accord-net/framework/graphs/contributors'
4:31:14 AM: },
4:31:14 AM: repos: [ { url: 'https://github.com/accord-net/framework', stars: 4404 } ],
4:31:14 AM: github_start_commit_data: {
4:31:14 AM: start_commit_link: '/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:31:14 AM: start_date: '2012-04-08T14:05:58Z'
4:31:14 AM: },
4:31:14 AM: image_data: {
4:31:14 AM: fileName: 'accord-net.svg',
4:31:14 AM: hash: 'tdvFWfo6hfZQqHqjscAtcoZWk6qztZFzBTVqKWUjYig='
4:31:14 AM: },
4:31:14 AM: firstCommitDate: '2012-04-08T14:05:58Z',
4:31:14 AM: firstCommitLink: 'https://github.com/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:31:14 AM: latestCommitDate: '2020-11-18T19:53:01Z',
4:31:14 AM: latestCommitLink: 'https://github.com/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:31:14 AM: releaseDate: '2017-10-19T21:00:56Z',
4:31:14 AM: releaseLink: 'https://github.com/accord-net/framework/releases',
4:31:14 AM: commitsThisYear: 0,
4:31:14 AM: contributorsCount: 98,
4:31:14 AM: contributorsLink: 'https://github.com/accord-net/framework/graphs/contributors',
4:31:14 AM: language: 'C#',
4:31:14 AM: stars: 4404,
4:31:14 AM: license: 'GNU Lesser General Public License v2.1',
4:31:14 AM: headquarters: 'Grenoble, France',
4:31:14 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:31:14 AM: organization: 'Accord.NET Framework',
4:31:14 AM: crunchbaseData: {
4:31:14 AM: name: 'Accord.NET Framework',
4:31:14 AM: description: 'Machine Learning Framework',
4:31:14 AM: homepage: 'http://accord-framework.net/',
4:31:14 AM: city: 'Grenoble',
4:31:14 AM: region: 'Rhone-Alpes',
4:31:14 AM: country: 'France',
4:31:14 AM: twitter: null,
4:31:14 AM: linkedin: null,
4:31:14 AM: acquisitions: [],
4:31:14 AM: parents: [],
4:31:14 AM: stockExchange: null,
4:31:14 AM: company_type: 'Non Profit',
4:31:14 AM: industries: [
4:31:14 AM: 'Analytics',
4:31:14 AM: 'Artificial Intelligence',
4:31:14 AM: 'Hardware',
4:31:14 AM: 'Machine Learning'
4:31:14 AM: ],
4:31:14 AM: numEmployeesMin: null,
4:31:14 AM: numEmployeesMax: null
4:31:14 AM: },
4:31:14 AM: path: 'Machine Learning / Framework',
4:31:14 AM: landscape: 'Machine Learning / Framework',
4:31:14 AM: category: 'Machine Learning',
4:31:14 AM: amount: 'N/A',
4:31:14 AM: oss: true,
4:31:14 AM: href: 'logos/accord-net.svg',
4:31:14 AM: bestPracticeBadgeId: false,
4:31:14 AM: bestPracticePercentage: null,
4:31:14 AM: industries: [
4:31:14 AM: 'Analytics',
4:31:14 AM: 'Artificial Intelligence',
4:31:14 AM: 'Hardware',
4:31:14 AM: 'Machine Learning'
4:31:14 AM: ],
4:31:14 AM: starsPresent: true,
4:31:14 AM: starsAsText: '4,404',
4:31:14 AM: marketCapPresent: false,
4:31:14 AM: marketCapAsText: 'N/A',
4:31:14 AM: id: 'accord-net',
4:31:14 AM: flatName: 'Accord.NET',
4:31:14 AM: member: false,
4:31:14 AM: relation: false,
4:31:14 AM: isSubsidiaryProject: false
4:31:14 AM: } 2020-11-18T19:53:01Z
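The dump above shows the shape of one fully processed entry: raw upstream data (github_data, crunchbaseData, image_data) merged with derived presentation fields such as starsAsText: '4,404' next to stars: 4404. A small sketch of that derivation, assumed rather than taken from the tool:

    // Derive display-ready fields from the raw numbers on a processed item.
    function withPresentationFields(item) {
      const hasStars = Number.isFinite(item.stars);
      const hasAmount = typeof item.amount === 'number';
      return {
        ...item,
        starsPresent: hasStars,
        starsAsText: hasStars ? item.stars.toLocaleString('en-US') : 'N/A',
        marketCapPresent: hasAmount,
        marketCapAsText: hasAmount ? item.amount.toLocaleString('en-US') : 'N/A',
      };
    }

    // withPresentationFields({ stars: 4404, amount: 'N/A' }).starsAsText === '4,404'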
4:31:14 AM: [
4:31:14 AM: 'Community Data License Agreement (CDLA)',
4:31:14 AM: 'PlaNet',
4:31:14 AM: 'Generic Neural Elastic Search (GNES)',
4:31:14 AM: 'PredictionIO',
4:31:14 AM: 'ELI5',
4:31:14 AM: 'BERT',
4:31:14 AM: 'Nauta',
4:31:14 AM: 'DAWNBench',
4:31:14 AM: 'AresDB',
4:31:14 AM: 'dotmesh',
4:31:14 AM: 'Audit AI',
4:31:14 AM: 'euler',
4:31:14 AM: 'Clipper',
4:31:14 AM: 'Accord.NET',
4:31:14 AM: 'Shogun',
4:31:14 AM: 'DELTA',
4:31:14 AM: 'BeakerX',
4:31:14 AM: 'PixieDust',
4:31:14 AM: 'TreeInterpreter',
4:31:14 AM: 'Cyclone',
4:31:14 AM: 'Lucid',
4:31:14 AM: 'XLM',
4:31:14 AM: 'Chainer RL',
4:31:14 AM: 'ForestFlow',
4:31:14 AM: 'uReplicator',
4:31:14 AM: 'Elastic Deep Learning (EDL)',
4:31:14 AM: 'Kashgari',
4:31:14 AM: 'X-DeepLearning',
4:31:14 AM: 'LIME',
4:31:14 AM: 'Model Asset eXchange (MAX)',
4:31:14 AM: 'TransmogrifAI',
4:31:14 AM: 'OpenBytes',
4:31:14 AM: 'DeepLIFT',
4:31:14 AM: 'Onepanel',
4:31:14 AM: 'DeepSpeech',
4:31:14 AM: 'Lucene',
4:31:14 AM: 'Turi Create',
4:31:14 AM: 'Visual Object Tagging Tool (VoTT)',
4:31:14 AM: 'Acumos',
4:31:14 AM: 'Skater',
4:31:14 AM: 'Catalyst',
4:31:14 AM: 'SKIP Language',
4:31:14 AM: 'SQLFlow',
4:31:14 AM: 'Advertorch',
4:31:14 AM: 'xLearn',
4:31:14 AM: 'Neuropod',
4:31:14 AM: 'AdvBox',
4:31:14 AM: 'RCloud',
4:31:14 AM: 'Neo-AI',
4:31:14 AM: 'Embedded Learning Library',
4:31:14 AM: 'Stable Baselines',
4:31:14 AM: 'talos',
4:31:14 AM: 'LabelImg',
4:31:14 AM: 'MMdnn',
4:31:14 AM: 'CNTK',
4:31:14 AM: 'Machine Learning eXchange',
4:31:14 AM: 'Singularity',
4:31:14 AM: 'Chainer',
4:31:14 AM: 'PyText',
4:31:14 AM: 'Pipeline.ai',
4:31:14 AM: 'Apache Bahir',
4:31:14 AM: 'NLP Architect',
4:31:14 AM: 'AllenNLP',
4:31:14 AM: 'Angel-ML',
4:31:14 AM: 'SEED RL',
4:31:14 AM: 'Coach',
4:31:14 AM: 'Gluon-NLP',
4:31:14 AM: 'DeepMind Lab',
4:31:14 AM: 'SEAL',
4:31:14 AM: 'MXNet',
4:31:14 AM: 'OpenAI Gym',
4:31:14 AM: 'MindMeld',
4:31:14 AM: 'CleverHans',
4:31:14 AM: 'Petastorm',
4:31:14 AM: 'Hawq',
4:31:14 AM: 'TF Encrypted',
4:31:14 AM: 'faust',
4:31:14 AM: 'Cortex',
4:31:14 AM: 'OpenDataology',
4:31:14 AM: 'YouTokenToMe',
4:31:14 AM: 'ALBERT',
4:31:14 AM: 'Adlik',
4:31:14 AM: '1chipML',
4:31:14 AM: 'Neural Network Distiller',
4:31:14 AM: 'Labelbox',
4:31:14 AM: 'Facets',
4:31:14 AM: 'OpenNN',
4:31:14 AM: 'Pilosa',
4:31:14 AM: 'Orchest',
4:31:14 AM: 'Model Server for Apache MXNet',
4:31:14 AM: 'LASER',
4:31:14 AM: 'Dopamine',
4:31:14 AM: 'MindSpore',
4:31:14 AM: 'HE Lib',
4:31:14 AM: 'd6tflow',
4:31:14 AM: 'Sonnet',
4:31:14 AM: 'Plaid ML',
4:31:14 AM: 'Nyoka',
4:31:14 AM: 'doccano',
4:31:14 AM: 'ecco',
4:31:14 AM: ... 252 more items
4:31:14 AM: ]
4:31:18 AM: ncc: Version 0.34.0
4:31:18 AM: ncc: Compiling file index.js into CJS
4:31:20 AM: ncc: Version 0.34.0
4:31:20 AM: ncc: Compiling file index.js into CJS
4:31:20 AM: ncc: Version 0.34.0
4:31:20 AM: ncc: Compiling file index.js into CJS
4:31:24 AM: Development server running at http://127.0.0.1:4000/
4:31:34 AM: Running integration-test,check-landscape,render-landscape,funding in parallel
4:31:36 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:41 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:42 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:31:44 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:31:44 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:31:45 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:31:45 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:31:46 AM: api request starting... /api/ids?category=&project=&license=&organization=accord-net-framework&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:46 AM: api request starting... /api/ids?category=&project=&license=&organization=microsoft&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:31:56 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:17 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
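The integration tests and the renderer query the dev server's /api/ids endpoint at 127.0.0.1:4000, passing the filter, sort, grouping and format parameters visible in the request lines above. A minimal way to issue one such request from Node 18 (the JSON response shape is an assumption):

    // Query the local landscape API the way the tests above do.
    const params = new URLSearchParams({
      organization: 'accord-net-framework',
      sort: 'name',
      grouping: 'project',
      format: 'card',
    });

    fetch(`http://127.0.0.1:4000/api/ids?${params}`)
      .then((res) => res.json())
      .then((ids) => console.log('matching ids:', ids));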
4:32:37 AM: Task: integration-test PASS specs/main.spec.js (9.955s)
4:32:37 AM: Main test
4:32:37 AM: I visit a main page and have all required elements
4:32:37 AM: ✓ I can open a page (1622ms)
4:32:37 AM: ✓ A proper header is present (6ms)
4:32:37 AM: ✓ Group headers are ok (2ms)
4:32:37 AM: ✓ I see a You are viewing text (2ms)
4:32:37 AM: ✓ A proper card is present (3ms)
4:32:37 AM: ✓ If I click on a card, I see a modal dialog (352ms)
4:32:37 AM: ✓ Closing a browser (28ms)
4:32:37 AM: Landscape Test
4:32:37 AM: ✓ I visit LF AI & Data Members landscape page and have all required elements, elements are clickable (934ms)
4:32:37 AM: ✓ Closing a browser (22ms)
4:32:37 AM: ✓ I visit Companies Hosting Projects landscape page and have all required elements, elements are clickable (901ms)
4:32:37 AM: ✓ Closing a browser (22ms)
4:32:37 AM: I visit a main landscape page and have all required elements
4:32:37 AM: ✓ I open a landscape page and wait for it to load (1853ms)
4:32:37 AM: ✓ When I click on an item the modal is open (72ms)
4:32:37 AM: ✓ If I would straight open the url with a selected id, a modal appears (2015ms)
4:32:37 AM: ✓ Closing a browser (33ms)
4:32:37 AM: Filtering by organization
4:32:37 AM: ✓ Checking we see Accord.NET when filtering by organization Accord.NET Framework (733ms)
4:32:37 AM: ✓ Checking we don't see Accord.NET when filtering by organization Microsoft (301ms)
4:32:37 AM: ✓ Closing a browser (25ms)
4:32:37 AM: ✓ Closing a browser (25ms)
4:32:37 AM: PASS specs/tools/actualTwitter.spec.js
4:32:37 AM: Twitter URL
4:32:37 AM: when crunchbase data not set
4:32:37 AM: ✓ returns URL from node (2ms)
4:32:37 AM: when node does not have twitter URL
4:32:37 AM: ✓ returns URL from node
4:32:37 AM: when node has twitter URL set to null
4:32:37 AM: ✓ returns undefined (1ms)
4:32:37 AM: when both node and crunchbase have twitter URL
4:32:37 AM: ✓ returns URL from node
4:32:37 AM: when twitter URL is not set anywhere
4:32:37 AM: ✓ returns undefined
4:32:37 AM: cleaning up twitter URL
4:32:37 AM: ✓ replaces http with https
4:32:37 AM: ✓ removes www
4:32:37 AM: ✓ query string (1ms)
4:32:37 AM: Test Suites: 2 passed, 2 total
4:32:37 AM: Tests: 26 passed, 26 total
4:32:37 AM: Snapshots: 0 total
4:32:37 AM: Time: 10.187s
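The actualTwitter spec names describe the normalization rules: prefer the entry's own Twitter URL over Crunchbase data, return undefined when nothing usable is set, force https, drop "www." and strip any query string. A sketch matching those descriptions; the real tools/actualTwitter may differ in detail:

    // Hedged reconstruction of the behaviour exercised by actualTwitter.spec.js.
    function actualTwitter(node, crunchbaseData) {
      const raw = node && node.twitter !== undefined
        ? node.twitter
        : (crunchbaseData ? crunchbaseData.twitter : undefined);
      if (!raw) return undefined;
      return raw
        .replace(/^http:\/\//, 'https://')   // "replaces http with https"
        .replace('://www.', '://')           // "removes www"
        .split('?')[0];                      // "query string" is dropped
    }

    // actualTwitter({ twitter: 'http://www.twitter.com/cncf?lang=en' })
    //   -> 'https://twitter.com/cncf'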
4:32:37 AM: Task: check-landscape
4:32:37 AM: Task: render-landscape visiting http://localhost:4000/fullscreen?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
4:32:37 AM: visiting http://localhost:4000/fullscreen/members?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
4:32:37 AM: visiting http://localhost:4000/fullscreen/hosting?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
4:32:37 AM: visiting http://localhost:4000/fullscreen?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
4:32:37 AM: visiting http://localhost:4000/fullscreen/members?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
4:32:37 AM: visiting http://localhost:4000/fullscreen/hosting?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
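render-landscape visits each /fullscreen page with a pdf flag; given that the Link step built puppeteer and downloaded Chromium, a headless-browser capture along these lines is presumably what happens (URL, output path and options here are illustrative):

    // Sketch of rendering a fullscreen landscape page to PDF with puppeteer.
    const puppeteer = require('puppeteer');

    (async () => {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.goto('http://localhost:4000/fullscreen?scale=false&pdf', {
        waitUntil: 'networkidle0',   // wait for the landscape to finish rendering
      });
      await page.pdf({ path: 'landscape.pdf', printBackground: true });
      await browser.close();
    })();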
4:32:37 AM: Task: funding From https://github.com/lfai/lfai-landscape
4:32:37 AM: * [new branch] NSouthernLF-patch-1 -> github/NSouthernLF-patch-1
4:32:37 AM: * [new branch] NSouthernLF-patch-2 -> github/NSouthernLF-patch-2
4:32:37 AM: * [new branch] NSouthernLF-patch-3 -> github/NSouthernLF-patch-3
4:32:37 AM: * [new branch] NSouthernLF-patch-3-1 -> github/NSouthernLF-patch-3-1
4:32:37 AM: * [new branch] create-pull-request/patch-1693628094 -> github/create-pull-request/patch-1693628094
4:32:37 AM: * [new branch] create-pull-request/patch-1693714471 -> github/create-pull-request/patch-1693714471
4:32:37 AM: * [new branch] create-pull-request/patch-1703668428 -> github/create-pull-request/patch-1703668428
4:32:37 AM: * [new branch] main -> github/main
4:32:37 AM: * [new branch] revert-303-main -> github/revert-303-main
4:32:37 AM: * [new branch] web-landscape-2021-10-01T22-18 -> github/web-landscape-2021-10-01T22-18
4:32:37 AM: * [new branch] web-landscape-2021-10-12T16-06 -> github/web-landscape-2021-10-12T16-06
4:32:37 AM: * [new branch] web-landscape-2021-10-18T19-46 -> github/web-landscape-2021-10-18T19-46
4:32:37 AM: * [new branch] web-landscape-2021-10-20T18-58 -> github/web-landscape-2021-10-20T18-58
4:32:40 AM: Output from remote build, exit code: 0
4:32:40 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:32:40 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:32:40 AM: * Documentation: https://help.ubuntu.com
4:32:40 AM: * Management: https://landscape.canonical.com
4:32:40 AM: * Support: https://ubuntu.com/advantage
4:32:40 AM: System information as of Thu Feb 13 04:30:20 UTC 2025
4:32:40 AM: System load: 0.38
4:32:40 AM: Usage of /: 75.8% of 217.51GB
4:32:40 AM: Memory usage: 18%
4:32:40 AM: Swap usage: 1%
4:32:40 AM: Processes: 649
4:32:40 AM: Users logged in: 1
4:32:40 AM: IPv4 address for bond0: 147.75.199.15
4:32:40 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:32:40 AM: IPv4 address for docker0: 172.17.0.1
4:32:40 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:32:40 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:32:40 AM: 82 updates can be applied immediately.
4:32:40 AM: 7 of these updates are standard security updates.
4:32:40 AM: To see these additional updates run: apt list --upgradable
4:32:40 AM: New release '22.04.5 LTS' available.
4:32:40 AM: Run 'do-release-upgrade' to upgrade to it.
4:32:40 AM: 2 updates could not be installed automatically. For more details,
4:32:40 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:32:40 AM: *** System restart required ***
4:32:40 AM: /opt/buildhome/.nvm/nvm.sh
4:32:40 AM: .:
4:32:40 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:32:40 AM: bin landscapes_dev package.json update_server
4:32:40 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:32:40 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:32:40 AM: files LICENSE server.js
4:32:40 AM: _headers netlify specs
4:32:40 AM: INSTALL.md netlify.md src
4:32:40 AM: v18.3
4:32:40 AM: Downloading and installing node v18.3.0...
4:32:40 AM: Downloading https://nodejs.org/dist/v18.3.0/node-v18.3.0-linux-x64.tar.xz...
4:32:40 AM: Computing checksum with sha256sum
4:32:40 AM: Checksums matched!
4:32:40 AM: Now using node v18.3.0 (npm v8.11.0)
4:32:40 AM: Now using node v18.3.0 (npm v8.11.0)
4:32:40 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:32:40 AM:
4:32:40 AM: added 3 packages, and audited 4 packages in 495ms
4:32:40 AM: found 0 vulnerabilities
4:32:40 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:32:40 AM:
4:32:40 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:32:40 AM: 27 packages are looking for funding
4:32:40 AM: run `npm fund` for details
4:32:40 AM: found 0 vulnerabilities
4:32:40 AM: added 1 package in 1s
4:32:40 AM: ➤ YN0000: ┌ Resolution step
4:32:40 AM: ➤ YN0000: └ Completed
4:32:40 AM: ➤ YN0000: ┌ Fetch step
4:32:40 AM: ➤ YN0013: │ 2 packages were already cached, 808 had to be fetched
4:32:40 AM: ➤ YN0000: └ Completed in 5s 619ms
4:32:40 AM: ➤ YN0000: ┌ Link step
4:32:40 AM: ➤ YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:32:40 AM: ➤ YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:32:40 AM: ➤ YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:32:40 AM: ➤ YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:32:40 AM: ➤ YN0000: │ puppeteer@npm:13.2.0 STDERR
4:32:40 AM: ➤ YN0000: │ puppeteer@npm:14.2.1 STDERR
4:32:40 AM: ➤ YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:32:40 AM: ➤ YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:32:40 AM: ➤ YN0000: └ Completed in 6s 556ms
4:32:40 AM: ➤ YN0000: Done with warnings in 12s 581ms
4:32:40 AM: Processing the tree
4:32:40 AM: Running with a level=easy. Settings:
4:32:40 AM: Use cached crunchbase data: true
4:32:40 AM: Use cached images data: true
4:32:40 AM: Use cached twitter data: true
4:32:40 AM: Use cached github basic stats: true
4:32:40 AM: Use cached github start dates: true
4:32:40 AM: Use cached best practices: true
4:32:40 AM: Fetching crunchbase entries
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ....................................................**
4:32:40 AM: Fetching github entries
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ..................................*********************.........................
4:32:40 AM: ................................................................................
4:32:40 AM: ...................................................................*********
4:32:40 AM: Fetching start date entries
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ............................................***********.........................
4:32:40 AM: ................................................................................
4:32:40 AM: .........................................................*******************
4:32:40 AM: Fetching images
4:32:40 AM: got image entries
4:32:40 AM: Hash for Prefect is prefect-2
4:32:40 AM: ................................................................................
4:32:40 AM: ....**......**..................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: Fetching last tweet dates
4:32:40 AM: Fetching best practices
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ................................................................................
4:32:40 AM: ...............................................
4:32:40 AM: Fetching CLOMonitor data
4:32:40 AM: Processing the tree
4:32:40 AM: saving!
4:32:40 AM: Hash for Prefect is prefect-2
4:32:40 AM: Warning: Open Platform for Enterprise AI (OPEA) has a twitter https://twitter.com/opeadev which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Dragonfly has a twitter https://twitter.com/dragonfly_oss which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Vald has a twitter https://twitter.com/vdaas_vald which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: FastTrackML has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Mathesar has a twitter https://twitter.com/centerofci which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: RWKV has a twitter https://twitter.com/RWKV_AI which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Langchain has a twitter https://twitter.com/LangChainAI which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Haystack has a twitter https://twitter.com/deepset_ai which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Armada has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: envd has a twitter https://twitter.com/TensorChord which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: NVIDIA Corporation has a twitter https://twitter.com/nvidia which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Advanced Micro Devices (AMD) has a twitter https://twitter.com/AMD which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Alibaba Cloud (Singapore) Private LTD has a twitter https://twitter.com/AlibabaB2B which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Fujitsu Limited has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: GitCode has a twitter https://twitter.com/turingbook which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Neo4j, Inc. has a twitter https://twitter.com/neo4j which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Posit has a twitter https://twitter.com/posit_pbc which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Snyk Limited has a twitter https://twitter.com/snyksec which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: AI for People has a twitter https://twitter.com/AIforPeople which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Ambianic.ai has a twitter https://twitter.com/ambianicai which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Columbia University has a twitter https://twitter.com/Columbia which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Common Crawl Foundation has a twitter https://twitter.com/commoncrawl which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Ersilia Open Source Initiative has a twitter https://twitter.com/ersiliaio which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Ewha Institute for Biomedical Law & Ethics has a twitter https://twitter.com/cybercampus which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Montreal AI Ethics Institute has a twitter https://twitter.com/mtlaiethics which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: NIPA has a twitter https://twitter.com/NIPAkr which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: One Fact Foundation has a twitter https://twitter.com/onefact_org which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Pranveer Singh Institute Of Technology has a twitter https://twitter.com/PSITKanpur2004 which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Rensselaer Polytechnic Institute has a twitter https://twitter.com/rpi which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Transport for NSW has a twitter https://twitter.com/TransportforNSW which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: University of Michigan has a twitter https://twitter.com/UMich which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Wikimedia Deutschland e. V. has a twitter https://twitter.com/WikimediaDE which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: World Ethical Data Foundation has a twitter https://twitter.com/WEDF_foundation which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: BasicAI(hosting) has a twitter https://twitter.com/BasicAIteam which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Fujitsu Limited(hosting) has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:32:40 AM: Warning: Owkin(hosting) has a twitter https://twitter.com/owkinscience which is invalid or we just can not fetch its tweets
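Editor's note: the warnings above are emitted while validating each entry's twitter URL against the cached tweet data. A minimal sketch of how such a check could produce these messages, assuming a lastTweetDates map keyed by twitter URL (the map name, item shape and function are illustrative assumptions, not the actual landscapeapp code):

    // Sketch: warn when an item declares a twitter URL but no tweet data was fetched for it.
    // `items` and `lastTweetDates` are hypothetical inputs; real field names may differ.
    function warnOnUnfetchableTwitters(items, lastTweetDates) {
      for (const item of items) {
        const twitter = item.twitter || (item.crunchbaseData && item.crunchbaseData.twitter);
        if (twitter && !lastTweetDates.has(twitter)) {
          console.info(
            `Warning: ${item.name} has a twitter ${twitter} which is invalid or we just can not fetch its tweets`
          );
        }
      }
    }

    // Example: produces a warning line like the ones above.
    warnOnUnfetchableTwitters(
      [{ name: 'Vald', twitter: 'https://twitter.com/vdaas_vald' }],
      new Map()
    );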
4:32:40 AM: Fetching members from LF AI & Data Member Company category
4:32:40 AM: Processing the tree
4:32:40 AM: Assigning Premier membership on DataHub (LinkedIn) because its parent Microsoft has Premier membership
4:32:40 AM: Assigning Premier membership on Brooklin (LinkedIn) because its parent Microsoft has Premier membership
4:32:40 AM: Assigning Premier membership on Azkaban (LinkedIn) because its parent Microsoft has Premier membership
4:32:40 AM: Assigning Premier membership on LinkedIn (hosting) (LinkedIn) because its parent Microsoft has Premier membership
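Editor's note: the four lines above describe membership inheritance: an entry with no membership of its own picks up its parent organization's level. A hedged sketch of that step, with item and map shapes assumed for illustration:

    // Hypothetical shapes: `parents` maps an organization to its parent organization's name,
    // `membershipByOrg` maps an organization to its membership level.
    function inheritMembership(items, parents, membershipByOrg) {
      for (const item of items) {
        if (item.member) continue;                    // already has its own membership
        const parent = parents[item.organization];    // e.g. 'LinkedIn' -> 'Microsoft'
        const parentLevel = parent && membershipByOrg[parent];
        if (parentLevel) {
          item.member = parentLevel;
          console.info(
            `Assigning ${parentLevel} membership on ${item.name} (${item.organization}) because its parent ${parent} has ${parentLevel} membership`
          );
        }
      }
    }

    inheritMembership(
      [{ name: 'DataHub', organization: 'LinkedIn' }],
      { LinkedIn: 'Microsoft' },
      { Microsoft: 'Premier' }
    );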
4:32:40 AM: Hash for Chaitanya Bharathi Institute of Technology is chaitanya-bharathi-institute-of-technology-2
4:32:40 AM: Hash for Fast.ai is fast-ai-2
4:32:40 AM: Hash for Great Expectations is great-expectations-2
4:32:40 AM: Hash for ML Perf is ml-perf-2
4:32:40 AM: Hash for PipelineAI is pipeline-ai-2
4:32:40 AM: Hash for Prefect is prefect-2
4:32:40 AM: Hash for Redash is redash-2
4:32:40 AM: Hash for Wikimedia Deutschland e.V. is wikimedia-deutschland-e-v-2
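Editor's note: the "Hash for X is x-2" lines indicate that an entry's generated id collided with one already taken, so a numeric suffix is appended. A minimal sketch of that kind of slug deduplication (the function and its inputs are illustrative, not the project's actual code):

    // Turn a display name into a URL-safe id, appending -2, -3, ... on collisions.
    function uniqueId(name, taken) {
      const base = name
        .toLowerCase()
        .replace(/[^a-z0-9]+/g, '-')   // non-alphanumerics become dashes
        .replace(/^-+|-+$/g, '');      // trim leading/trailing dashes
      let id = base;
      for (let n = 2; taken.has(id); n++) {
        id = `${base}-${n}`;
      }
      taken.add(id);
      return id;
    }

    const taken = new Set(['prefect']);
    console.log(uniqueId('Prefect', taken)); // prefect-2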
4:32:40 AM: {
4:32:40 AM: name: 'Accord.NET',
4:32:40 AM: homepage_url: 'http://accord-framework.net/',
4:32:40 AM: repo_url: 'https://github.com/accord-net/framework',
4:32:40 AM: logo: 'accord-net.svg',
4:32:40 AM: crunchbase: 'https://www.crunchbase.com/organization/accord-net-framework-project',
4:32:40 AM: github_data: {
4:32:40 AM: languages: [
4:32:40 AM: [Object], [Object],
4:32:40 AM: [Object], [Object],
4:32:40 AM: [Object], [Object],
4:32:40 AM: [Object], [Object],
4:32:40 AM: [Object], [Object],
4:32:40 AM: [Object], [Object],
4:32:40 AM: [Object], [Object],
4:32:40 AM: [Object]
4:32:40 AM: ],
4:32:40 AM: contributions: '0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0',
4:32:40 AM: firstWeek: '2022-11-27Z',
4:32:40 AM: stars: 4404,
4:32:40 AM: license: 'GNU Lesser General Public License v2.1',
4:32:40 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:32:40 AM: latest_commit_date: '2020-11-18T19:53:01Z',
4:32:40 AM: latest_commit_link: '/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:32:40 AM: release_date: '2017-10-19T21:00:56Z',
4:32:40 AM: release_link: 'https://github.com/accord-net/framework/releases',
4:32:40 AM: contributors_count: 98,
4:32:40 AM: contributors_link: 'https://github.com/accord-net/framework/graphs/contributors'
4:32:40 AM: },
4:32:40 AM: repos: [ { url: 'https://github.com/accord-net/framework', stars: 4404 } ],
4:32:40 AM: github_start_commit_data: {
4:32:40 AM: start_commit_link: '/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:32:40 AM: start_date: '2012-04-08T14:05:58Z'
4:32:40 AM: },
4:32:40 AM: image_data: {
4:32:40 AM: fileName: 'accord-net.svg',
4:32:40 AM: hash: 'tdvFWfo6hfZQqHqjscAtcoZWk6qztZFzBTVqKWUjYig='
4:32:40 AM: },
4:32:40 AM: firstCommitDate: '2012-04-08T14:05:58Z',
4:32:40 AM: firstCommitLink: 'https://github.com/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:32:40 AM: latestCommitDate: '2020-11-18T19:53:01Z',
4:32:40 AM: latestCommitLink: 'https://github.com/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:32:40 AM: releaseDate: '2017-10-19T21:00:56Z',
4:32:40 AM: releaseLink: 'https://github.com/accord-net/framework/releases',
4:32:40 AM: commitsThisYear: 0,
4:32:40 AM: contributorsCount: 98,
4:32:40 AM: contributorsLink: 'https://github.com/accord-net/framework/graphs/contributors',
4:32:40 AM: language: 'C#',
4:32:40 AM: stars: 4404,
4:32:40 AM: license: 'GNU Lesser General Public License v2.1',
4:32:40 AM: headquarters: 'Grenoble, France',
4:32:40 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:32:40 AM: organization: 'Accord.NET Framework',
4:32:40 AM: crunchbaseData: {
4:32:40 AM: name: 'Accord.NET Framework',
4:32:40 AM: description: 'Machine Learning Framework',
4:32:40 AM: homepage: 'http://accord-framework.net/',
4:32:40 AM: city: 'Grenoble',
4:32:40 AM: region: 'Rhone-Alpes',
4:32:40 AM: country: 'France',
4:32:40 AM: twitter: null,
4:32:40 AM: linkedin: null,
4:32:40 AM: acquisitions: [],
4:32:40 AM: parents: [],
4:32:40 AM: stockExchange: null,
4:32:40 AM: company_type: 'Non Profit',
4:32:40 AM: industries: [
4:32:40 AM: 'Analytics',
4:32:40 AM: 'Artificial Intelligence',
4:32:40 AM: 'Hardware',
4:32:40 AM: 'Machine Learning'
4:32:40 AM: ],
4:32:40 AM: numEmployeesMin: null,
4:32:40 AM: numEmployeesMax: null
4:32:40 AM: },
4:32:40 AM: path: 'Machine Learning / Framework',
4:32:40 AM: landscape: 'Machine Learning / Framework',
4:32:40 AM: category: 'Machine Learning',
4:32:40 AM: amount: 'N/A',
4:32:40 AM: oss: true,
4:32:40 AM: href: 'logos/accord-net.svg',
4:32:40 AM: bestPracticeBadgeId: false,
4:32:40 AM: bestPracticePercentage: null,
4:32:40 AM: industries: [
4:32:40 AM: 'Analytics',
4:32:40 AM: 'Artificial Intelligence',
4:32:40 AM: 'Hardware',
4:32:40 AM: 'Machine Learning'
4:32:40 AM: ],
4:32:40 AM: starsPresent: true,
4:32:40 AM: starsAsText: '4,404',
4:32:40 AM: marketCapPresent: false,
4:32:40 AM: marketCapAsText: 'N/A',
4:32:40 AM: id: 'accord-net',
4:32:40 AM: flatName: 'Accord.NET',
4:32:40 AM: member: false,
4:32:40 AM: relation: false,
4:32:40 AM: isSubsidiaryProject: false
4:32:40 AM: } 2020-11-18T19:53:01Z
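Editor's note: the dumped Accord.NET entry shows how raw github/crunchbase data is flattened into display-ready fields such as starsPresent, starsAsText and marketCapAsText. A hedged sketch of that derivation step (field names are taken from the dump above; the function itself is an assumption):

    // Derive presentation fields from an item shaped like the entry printed above.
    function addDisplayFields(item) {
      const stars = item.github_data ? item.github_data.stars : null;
      return {
        ...item,
        stars,
        starsPresent: Number.isFinite(stars),
        starsAsText: Number.isFinite(stars) ? stars.toLocaleString('en-US') : 'N/A', // 4404 -> '4,404'
        marketCapPresent: item.amount !== undefined && item.amount !== 'N/A',
        marketCapAsText: item.amount && item.amount !== 'N/A' ? item.amount : 'N/A'
      };
    }

    const item = addDisplayFields({ name: 'Accord.NET', amount: 'N/A', github_data: { stars: 4404 } });
    console.log(item.starsAsText); // 4,404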
4:32:40 AM: [
4:32:40 AM: 'Community Data License Agreement (CDLA)',
4:32:40 AM: 'PlaNet',
4:32:40 AM: 'Generic Neural Elastic Search (GNES)',
4:32:40 AM: 'PredictionIO',
4:32:40 AM: 'ELI5',
4:32:40 AM: 'BERT',
4:32:40 AM: 'Nauta',
4:32:40 AM: 'DAWNBench',
4:32:40 AM: 'AresDB',
4:32:40 AM: 'dotmesh',
4:32:40 AM: 'Audit AI',
4:32:40 AM: 'euler',
4:32:40 AM: 'Clipper',
4:32:40 AM: 'Accord.NET',
4:32:40 AM: 'Shogun',
4:32:40 AM: 'DELTA',
4:32:40 AM: 'BeakerX',
4:32:40 AM: 'PixieDust',
4:32:40 AM: 'TreeInterpreter',
4:32:40 AM: 'Cyclone',
4:32:40 AM: 'Lucid',
4:32:40 AM: 'XLM',
4:32:40 AM: 'Chainer RL',
4:32:40 AM: 'ForestFlow',
4:32:40 AM: 'uReplicator',
4:32:40 AM: 'Elastic Deep Learning (EDL)',
4:32:40 AM: 'Kashgari',
4:32:40 AM: 'X-DeepLearning',
4:32:40 AM: 'LIME',
4:32:40 AM: 'Model Asset eXchange (MAX)',
4:32:40 AM: 'TransmogrifAI',
4:32:40 AM: 'OpenBytes',
4:32:40 AM: 'DeepLIFT',
4:32:40 AM: 'Onepanel',
4:32:40 AM: 'DeepSpeech',
4:32:40 AM: 'Lucene',
4:32:40 AM: 'Turi Create',
4:32:40 AM: 'Visual Object Tagging Tool (VoTT)',
4:32:40 AM: 'Acumos',
4:32:40 AM: 'Skater',
4:32:40 AM: 'Catalyst',
4:32:40 AM: 'SKIP Language',
4:32:40 AM: 'SQLFlow',
4:32:40 AM: 'Advertorch',
4:32:40 AM: 'xLearn',
4:32:40 AM: 'Neuropod',
4:32:40 AM: 'AdvBox',
4:32:40 AM: 'RCloud',
4:32:40 AM: 'Neo-AI',
4:32:40 AM: 'Embedded Learning Library',
4:32:40 AM: 'Stable Baselines',
4:32:40 AM: 'talos',
4:32:40 AM: 'LabelImg',
4:32:40 AM: 'MMdnn',
4:32:40 AM: 'CNTK',
4:32:40 AM: 'Machine Learning eXchange',
4:32:40 AM: 'Singularity',
4:32:40 AM: 'Chainer',
4:32:40 AM: 'PyText',
4:32:40 AM: 'Pipeline.ai',
4:32:40 AM: 'Apache Bahir',
4:32:40 AM: 'NLP Architect',
4:32:40 AM: 'AllenNLP',
4:32:40 AM: 'Angel-ML',
4:32:40 AM: 'SEED RL',
4:32:40 AM: 'Coach',
4:32:40 AM: 'Gluon-NLP',
4:32:40 AM: 'DeepMind Lab',
4:32:40 AM: 'SEAL',
4:32:40 AM: 'MXNet',
4:32:40 AM: 'OpenAI Gym',
4:32:40 AM: 'MindMeld',
4:32:40 AM: 'CleverHans',
4:32:40 AM: 'Petastorm',
4:32:40 AM: 'Hawq',
4:32:40 AM: 'TF Encrypted',
4:32:40 AM: 'faust',
4:32:40 AM: 'Cortex',
4:32:40 AM: 'OpenDataology',
4:32:40 AM: 'YouTokenToMe',
4:32:40 AM: 'ALBERT',
4:32:40 AM: 'Adlik',
4:32:40 AM: '1chipML',
4:32:40 AM: 'Neural Network Distiller',
4:32:40 AM: 'Labelbox',
4:32:40 AM: 'Facets',
4:32:40 AM: 'OpenNN',
4:32:40 AM: 'Pilosa',
4:32:40 AM: 'Orchest',
4:32:40 AM: 'Model Server for Apache MXNet',
4:32:40 AM: 'LASER',
4:32:40 AM: 'Dopamine',
4:32:40 AM: 'MindSpore',
4:32:40 AM: 'HE Lib',
4:32:40 AM: 'd6tflow',
4:32:40 AM: 'Sonnet',
4:32:40 AM: 'Plaid ML',
4:32:40 AM: 'Nyoka',
4:32:40 AM: 'doccano',
4:32:40 AM: 'ecco',
4:32:40 AM: ... 252 more items
4:32:40 AM: ]
4:32:40 AM: ncc: Version 0.34.0
4:32:40 AM: ncc: Compiling file index.js into CJS
4:32:40 AM: ncc: Version 0.34.0
4:32:40 AM: ncc: Compiling file index.js into CJS
4:32:40 AM: ncc: Version 0.34.0
4:32:40 AM: ncc: Compiling file index.js into CJS
4:32:40 AM: Development server running at http://127.0.0.1:4000/
4:32:40 AM: Running integration-test,check-landscape,render-landscape,funding in parallel
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=accord-net-framework&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=microsoft&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:32:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
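Editor's note: each "api request starting..." line hits the /api/ids endpoint with the full set of filter parameters, most of them empty. A small sketch of composing such a request URL with URLSearchParams (the parameter names are copied from the log; the helper itself is illustrative):

    // Build an /api/ids URL like the ones requested above.
    function idsUrl(overrides = {}) {
      const defaults = {
        category: '', project: '', license: '', organization: '', headquarters: '',
        'company-type': '', industries: '', sort: 'name', grouping: 'no',
        bestpractices: '', enduser: '', parent: '', language: '', specification: '',
        format: 'main'
      };
      const params = new URLSearchParams({ ...defaults, ...overrides });
      return `/api/ids?${params.toString()}`;
    }

    console.log(idsUrl({ organization: 'microsoft', grouping: 'project', format: 'card' }));
    // -> /api/ids?category=&project=&license=&organization=microsoft&... (same shape as the requests above)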
4:32:40 AM: Task: integration-test PASS specs/main.spec.js (9.955s)
4:32:40 AM: Main test
4:32:40 AM: I visit a main page and have all required elements
4:32:40 AM: ✓ I can open a page (1622ms)
4:32:40 AM: ✓ A proper header is present (6ms)
4:32:40 AM: ✓ Group headers are ok (2ms)
4:32:40 AM: ✓ I see a You are viewing text (2ms)
4:32:40 AM: ✓ A proper card is present (3ms)
4:32:40 AM: ✓ If I click on a card, I see a modal dialog (352ms)
4:32:40 AM: ✓ Closing a browser (28ms)
4:32:40 AM: Landscape Test
4:32:40 AM: ✓ I visit LF AI & Data Members landscape page and have all required elements, elements are clickable (934ms)
4:32:40 AM: ✓ Closing a browser (22ms)
4:32:40 AM: ✓ I visit Companies Hosting Projects landscape page and have all required elements, elements are clickable (901ms)
4:32:40 AM: ✓ Closing a browser (22ms)
4:32:40 AM: I visit a main landscape page and have all required elements
4:32:40 AM: ✓ I open a landscape page and wait for it to load (1853ms)
4:32:40 AM: ✓ When I click on an item the modal is open (72ms)
4:32:40 AM: ✓ If I would straight open the url with a selected id, a modal appears (2015ms)
4:32:40 AM: ✓ Closing a browser (33ms)
4:32:40 AM: Filtering by organization
4:32:40 AM: ✓ Checking we see Accord.NET when filtering by organization Accord.NET Framework (733ms)
4:32:40 AM: ✓ Checking we don't see Accord.NET when filtering by organization Microsoft (301ms)
4:32:40 AM: ✓ Closing a browser (25ms)
4:32:40 AM: ✓ Closing a browser (25ms)
4:32:40 AM: PASS specs/tools/actualTwitter.spec.js
4:32:40 AM: Twitter URL
4:32:40 AM: when crunchbase data not set
4:32:40 AM: ✓ returns URL from node (2ms)
4:32:40 AM: when node does not have twitter URL
4:32:40 AM: ✓ returns URL from node
4:32:40 AM: when node has twitter URL set to null
4:32:40 AM: ✓ returns undefined (1ms)
4:32:40 AM: when both node and crunchbase have twitter URL
4:32:40 AM: ✓ returns URL from node
4:32:40 AM: when twitter URL is not set anywhere
4:32:40 AM: ✓ returns undefined
4:32:40 AM: cleaning up twitter URL
4:32:40 AM: ✓ replaces http with https
4:32:40 AM: ✓ removes www
4:32:40 AM: ✓ query string (1ms)
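Editor's note: the actualTwitter spec names describe the behaviour under test: prefer the URL on the node over crunchbase, treat an explicit null as "no twitter", and normalise http://, www. and query strings. A sketch consistent with those test titles (a reconstruction from the spec output, not the repository's actualTwitter.js):

    // Pick and clean a twitter URL for an item, per the behaviours listed in the spec output above.
    function actualTwitter(node, crunchbaseEntry) {
      // node.twitter === null means "explicitly no twitter"
      const raw = node && node.twitter !== undefined
        ? node.twitter
        : crunchbaseEntry && crunchbaseEntry.twitter;
      if (!raw) return undefined;
      return raw
        .replace(/^http:\/\//, 'https://')  // replaces http with https
        .replace('://www.', '://')          // removes www
        .split('?')[0];                     // drops any query string
    }

    console.log(actualTwitter({ twitter: 'http://www.twitter.com/AMD?lang=en' }));
    // https://twitter.com/AMD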
4:32:40 AM: Test Suites: 2 passed, 2 total
4:32:40 AM: Tests: 26 passed, 26 total
4:32:40 AM: Snapshots: 0 total
4:32:40 AM: Time: 10.187s
4:32:40 AM: Task: check-landscape
4:32:40 AM: Task: render-landscape visiting http://localhost:4000/fullscreen?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
4:32:40 AM: visiting http://localhost:4000/fullscreen/members?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
4:32:40 AM: visiting http://localhost:4000/fullscreen/hosting?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
4:32:40 AM: visiting http://localhost:4000/fullscreen?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
4:32:40 AM: visiting http://localhost:4000/fullscreen/members?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
4:32:40 AM: visiting http://localhost:4000/fullscreen/hosting?version=2025-02-13T04:31:35Z 6a7287a&scale=false&pdf
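Editor's note: the render-landscape task visits each fullscreen view (main, members, hosting) served by the dev server on port 4000 to produce static renderings. A hedged sketch of such a visit using puppeteer, which the dependency output later in this log shows being installed; the launch flags and output path are assumptions:

    const puppeteer = require('puppeteer');

    // Render one fullscreen landscape view to a PDF from the local dev server.
    async function renderFullscreen(path, outFile) {
      const browser = await puppeteer.launch({ args: ['--no-sandbox'] });
      const page = await browser.newPage();
      await page.goto(`http://localhost:4000${path}`, { waitUntil: 'networkidle0' });
      await page.pdf({ path: outFile, printBackground: true, landscape: true });
      await browser.close();
    }

    renderFullscreen('/fullscreen?scale=false&pdf', 'landscape.pdf')
      .catch((err) => { console.error(err); process.exit(1); });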
4:32:40 AM: Task: funding From https://github.com/lfai/lfai-landscape
4:32:40 AM: * [new branch] NSouthernLF-patch-1 -> github/NSouthernLF-patch-1
4:32:40 AM: * [new branch] NSouthernLF-patch-2 -> github/NSouthernLF-patch-2
4:32:40 AM: * [new branch] NSouthernLF-patch-3 -> github/NSouthernLF-patch-3
4:32:40 AM: * [new branch] NSouthernLF-patch-3-1 -> github/NSouthernLF-patch-3-1
4:32:40 AM: * [new branch] create-pull-request/patch-1693628094 -> github/create-pull-request/patch-1693628094
4:32:40 AM: * [new branch] create-pull-request/patch-1693714471 -> github/create-pull-request/patch-1693714471
4:32:40 AM: * [new branch] create-pull-request/patch-1703668428 -> github/create-pull-request/patch-1703668428
4:32:40 AM: * [new branch] main -> github/main
4:32:40 AM: * [new branch] revert-303-main -> github/revert-303-main
4:32:40 AM: * [new branch] web-landscape-2021-10-01T22-18 -> github/web-landscape-2021-10-01T22-18
4:32:40 AM: * [new branch] web-landscape-2021-10-12T16-06 -> github/web-landscape-2021-10-12T16-06
4:32:40 AM: * [new branch] web-landscape-2021-10-18T19-46 -> github/web-landscape-2021-10-18T19-46
4:32:40 AM: * [new branch] web-landscape-2021-10-20T18-58 -> github/web-landscape-2021-10-20T18-58
4:32:44 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:32:44 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:32:44 AM: * Documentation: https://help.ubuntu.com
4:32:44 AM: * Management: https://landscape.canonical.com
4:32:44 AM: * Support: https://ubuntu.com/advantage
4:32:44 AM: System information as of Thu Feb 13 04:32:44 UTC 2025
4:32:44 AM: System load: 1.3
4:32:44 AM: Usage of /: 76.3% of 217.51GB
4:32:44 AM: Memory usage: 18%
4:32:44 AM: Swap usage: 1%
4:32:44 AM: Processes: 615
4:32:44 AM: Users logged in: 1
4:32:44 AM: IPv4 address for bond0: 147.75.199.15
4:32:44 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:32:44 AM: IPv4 address for docker0: 172.17.0.1
4:32:44 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:32:44 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:32:44 AM: 82 updates can be applied immediately.
4:32:44 AM: 7 of these updates are standard security updates.
4:32:44 AM: To see these additional updates run: apt list --upgradable
4:32:44 AM: New release '22.04.5 LTS' available.
4:32:44 AM: Run 'do-release-upgrade' to upgrade to it.
4:32:44 AM: 2 updates could not be installed automatically. For more details,
4:32:44 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:32:44 AM: *** System restart required ***
4:32:45 AM: Remote build done!
4:32:45 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:32:45 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:32:45 AM: * Documentation: https://help.ubuntu.com
4:32:45 AM: * Management: https://landscape.canonical.com
4:32:45 AM: * Support: https://ubuntu.com/advantage
4:32:45 AM: System information as of Thu Feb 13 04:30:20 UTC 2025
4:32:45 AM: System load: 0.38
4:32:45 AM: Usage of /: 75.8% of 217.51GB
4:32:45 AM: Memory usage: 18%
4:32:45 AM: Swap usage: 1%
4:32:45 AM: Processes: 649
4:32:45 AM: Users logged in: 1
4:32:45 AM: IPv4 address for bond0: 147.75.199.15
4:32:45 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:32:45 AM: IPv4 address for docker0: 172.17.0.1
4:32:45 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:32:45 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:32:45 AM: 82 updates can be applied immediately.
4:32:45 AM: 7 of these updates are standard security updates.
4:32:45 AM: To see these additional updates run: apt list --upgradable
4:32:45 AM: New release '22.04.5 LTS' available.
4:32:45 AM: Run 'do-release-upgrade' to upgrade to it.
4:32:45 AM: 2 updates could not be installed automatically. For more details,
4:32:45 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:32:45 AM: *** System restart required ***
4:32:45 AM: /opt/buildhome/.nvm/nvm.sh
4:32:45 AM: .:
4:32:45 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:32:45 AM: bin landscapes_dev package.json update_server
4:32:45 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:32:45 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:32:45 AM: files LICENSE server.js
4:32:45 AM: _headers netlify specs
4:32:45 AM: INSTALL.md netlify.md src
4:32:45 AM: v18.3
4:32:45 AM: Downloading and installing node v18.3.0...
4:32:45 AM: Downloading https://nodejs.org/dist/v18.3.0/node-v18.3.0-linux-x64.tar.xz...
4:32:45 AM: Computing checksum with sha256sum
4:32:45 AM: Checksums matched!
4:32:45 AM: Now using node v18.3.0 (npm v8.11.0)
4:32:45 AM: Now using node v18.3.0 (npm v8.11.0)
4:32:45 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:32:45 AM:
4:32:45 AM: added 3 packages, and audited 4 packages in 495ms
4:32:45 AM: found 0 vulnerabilities
4:32:45 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:32:45 AM:
4:32:45 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:32:45 AM: 27 packages are looking for funding
4:32:45 AM: run `npm fund` for details
4:32:45 AM: found 0 vulnerabilities
4:32:45 AM: added 1 package in 1s
4:32:45 AM: ➤ YN0000: ┌ Resolution step
4:32:45 AM: ➤ YN0000: └ Completed
4:32:45 AM: ➤ YN0000: ┌ Fetch step
4:32:45 AM: ➤ YN0013: │ 2 packages were already cached, 808 had to be fetched
4:32:45 AM: ➤ YN0000: └ Completed in 5s 619ms
4:32:45 AM: ➤ YN0000: ┌ Link step
4:32:45 AM: ➤ YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:32:45 AM: ➤ YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:32:45 AM: ➤ YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:32:45 AM: ➤ YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:32:45 AM: ➤ YN0000: │ puppeteer@npm:13.2.0 STDERR
4:32:45 AM: ➤ YN0000: │ puppeteer@npm:14.2.1 STDERR
4:32:45 AM: ➤ YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:32:45 AM: ➤ YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:32:45 AM: ➤ YN0000: └ Completed in 6s 556ms
4:32:45 AM: ➤ YN0000: Done with warnings in 12s 581ms
4:32:45 AM:
4:32:45 AM: (build.command completed in 2m 31.9s)
4:32:45 AM:
4:32:45 AM: Functions bundling
4:32:45 AM: ────────────────────────────────────────────────────────────────
4:32:45 AM:
4:32:45 AM: Packaging Functions from /opt/build/repo/functions directory:
4:32:45 AM: - export.js
4:32:45 AM: - ids.js
4:32:45 AM: - items.js
4:32:45 AM:
4:32:50 AM:
4:32:50 AM: (Functions bundling completed in 5.5s)
4:32:50 AM:
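Editor's note: the three bundled functions (export.js, ids.js, items.js) presumably back the /api/* requests seen earlier in this log. A minimal handler in the shape ids.js might take; only the exports.handler signature is standard Netlify Functions, the query handling here is an assumption:

    // functions/ids.js — illustrative handler shape only.
    exports.handler = async (event) => {
      // event.queryStringParameters carries filters such as organization, grouping, format.
      const { organization = '', format = 'main' } = event.queryStringParameters || {};
      const ids = []; // real code would filter the processed landscape items
      return {
        statusCode: 200,
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ organization, format, ids })
      };
    };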
4:33:00 AM: (Netlify Build completed in 2m 47.1s)
4:33:01 AM: Section completed: building
4:33:07 AM: Finished processing build request in 3m22.219s
Deploying
Complete
4:32:50 AM: Deploy site
4:32:50 AM: ────────────────────────────────────────────────────────────────
4:32:50 AM:
4:32:50 AM: Starting to deploy site from 'dist'
4:32:51 AM: Calculating files to upload
4:32:53 AM: 10 new file(s) to upload
4:32:53 AM: 3 new function(s) to upload
4:33:00 AM: Section completed: deploying
4:33:00 AM: Site deploy was successfully initiated
4:33:00 AM:
4:33:00 AM: (Deploy site completed in 9.5s)
Cleanup
Complete
4:33:00 AM: Netlify Build Complete
4:33:00 AM: ────────────────────────────────────────────────────────────────
4:33:00 AM:
4:33:01 AM: Caching artifacts
4:33:01 AM: Started saving build plugins
4:33:01 AM: Finished saving build plugins
4:33:01 AM: Started saving mise cache
4:33:01 AM: Finished saving mise cache
4:33:01 AM: Started saving pip cache
4:33:01 AM: Finished saving pip cache
4:33:01 AM: Started saving emacs cask dependencies
4:33:01 AM: Finished saving emacs cask dependencies
4:33:01 AM: Started saving maven dependencies
4:33:01 AM: Finished saving maven dependencies
4:33:01 AM: Started saving boot dependencies
4:33:01 AM: Finished saving boot dependencies
4:33:01 AM: Started saving rust rustup cache
4:33:01 AM: Finished saving rust rustup cache
4:33:01 AM: Started saving go dependencies
4:33:01 AM: Finished saving go dependencies
4:33:01 AM: Build script success
4:33:06 AM: Uploading Cache of size 195.4MB
4:33:07 AM: Section completed: cleanup
Post-processing
Complete
4:33:00 AM: Starting post processing
4:33:00 AM: Skipping form detection
4:33:00 AM: Post processing - redirect rules
4:33:00 AM: Post processing - header rules
4:33:00 AM: Post processing done
4:33:00 AM: Section completed: postprocessing
4:33:01 AM: Site is live ✨