Deploy details

Deploy successful for lfailandscape

Update Landscape from LFX 2025-02-13

PR #839: create-pull-request/patch-1739420663@60a7223

Deploy log

Initializing

Complete
4:24:27 AM: Build ready to start
4:24:41 AM: build-image version: 9c9fb6952e50bb092d4b66daf2368677e5c68e34 (focal)
4:24:41 AM: buildbot version: 9c9fb6952e50bb092d4b66daf2368677e5c68e34
4:24:41 AM: Fetching cached dependencies
4:24:42 AM: Starting to download cache of 216.2MB
4:24:44 AM: Finished downloading cache in 2.212s
4:24:44 AM: Starting to extract cache
4:24:45 AM: Finished extracting cache in 1.654s
4:24:45 AM: Finished fetching cache in 3.93s
4:24:46 AM: Starting to prepare the repo for build
4:24:46 AM: Preparing Git Reference pull/839/head
4:24:47 AM: Custom build path detected. Proceeding with the specified path: 'netlify'
4:24:47 AM: Custom functions path detected. Proceeding with the specified path: 'functions'
4:24:47 AM: Custom build command detected. Proceeding with the specified command: '(wget --no-check-certificate --no-cache https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js) && node landscape.js'
4:24:47 AM: Custom ignore command detected. Proceeding with the specified command: 'false'
4:24:47 AM: manpath: warning: $PATH not set
4:24:48 AM: Starting to install dependencies
4:24:48 AM: Started restoring cached mise cache
4:24:49 AM: Finished restoring cached mise cache
4:24:49 AM: mise python@3.13.2 install
4:24:49 AM: mise python@3.13.2 download cpython-3.13.2+20250212-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
4:24:50 AM: mise python@3.13.2 extract cpython-3.13.2+20250212-x86_64-unknown-linux-gnu-install_only_stripped.tar.gz
4:24:50 AM: mise python@3.13.2 python --version
4:24:50 AM: mise python@3.13.2 Python 3.13.2
4:24:50 AM: mise python@3.13.2 installed
4:24:50 AM: Python version set to 3.13
4:24:52 AM: Collecting pipenv
4:24:52 AM: Downloading pipenv-2024.4.1-py3-none-any.whl.metadata (17 kB)
4:24:52 AM: Collecting certifi (from pipenv)
4:24:52 AM: Downloading certifi-2025.1.31-py3-none-any.whl.metadata (2.5 kB)
4:24:52 AM: Collecting packaging>=22 (from pipenv)
4:24:52 AM: Downloading packaging-24.2-py3-none-any.whl.metadata (3.2 kB)
4:24:52 AM: Collecting setuptools>=67 (from pipenv)
4:24:52 AM: Downloading setuptools-75.8.0-py3-none-any.whl.metadata (6.7 kB)
4:24:52 AM: Collecting virtualenv>=20.24.2 (from pipenv)
4:24:52 AM: Downloading virtualenv-20.29.2-py3-none-any.whl.metadata (4.5 kB)
4:24:52 AM: Collecting distlib<1,>=0.3.7 (from virtualenv>=20.24.2->pipenv)
4:24:52 AM: Downloading distlib-0.3.9-py2.py3-none-any.whl.metadata (5.2 kB)
4:24:52 AM: Collecting filelock<4,>=3.12.2 (from virtualenv>=20.24.2->pipenv)
4:24:52 AM: Downloading filelock-3.17.0-py3-none-any.whl.metadata (2.9 kB)
4:24:52 AM: Collecting platformdirs<5,>=3.9.1 (from virtualenv>=20.24.2->pipenv)
4:24:52 AM: Downloading platformdirs-4.3.6-py3-none-any.whl.metadata (11 kB)
4:24:52 AM: Downloading pipenv-2024.4.1-py3-none-any.whl (3.0 MB)
4:24:52 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.0/3.0 MB 63.4 MB/s eta 0:00:00
4:24:52 AM: Downloading packaging-24.2-py3-none-any.whl (65 kB)
4:24:52 AM: Downloading setuptools-75.8.0-py3-none-any.whl (1.2 MB)
4:24:52 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 65.9 MB/s eta 0:00:00
4:24:52 AM: Downloading virtualenv-20.29.2-py3-none-any.whl (4.3 MB)
4:24:52 AM: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.3/4.3 MB 168.6 MB/s eta 0:00:00
4:24:52 AM: Downloading certifi-2025.1.31-py3-none-any.whl (166 kB)
4:24:52 AM: Downloading distlib-0.3.9-py2.py3-none-any.whl (468 kB)
4:24:52 AM: Downloading filelock-3.17.0-py3-none-any.whl (16 kB)
4:24:52 AM: Downloading platformdirs-4.3.6-py3-none-any.whl (18 kB)
4:24:53 AM: Installing collected packages: distlib, setuptools, platformdirs, packaging, filelock, certifi, virtualenv, pipenv
4:24:55 AM: Successfully installed certifi-2025.1.31 distlib-0.3.9 filelock-3.17.0 packaging-24.2 pipenv-2024.4.1 platformdirs-4.3.6 setuptools-75.8.0 virtualenv-20.29.2
4:24:55 AM: [notice] A new release of pip is available: 24.3.1 -> 25.0.1
4:24:55 AM: [notice] To update, run: pip install --upgrade pip
4:24:55 AM: Attempting Ruby version 2.6.2, read from environment
4:24:55 AM: Started restoring cached Ruby version
4:24:55 AM: Finished restoring cached Ruby version
4:24:56 AM: Using Ruby version 2.6.2
4:24:56 AM: Started restoring cached go cache
4:24:56 AM: Finished restoring cached go cache
4:24:56 AM: Installing Go version 1.12 (requested 1.12)
4:25:00 AM: go version go1.12 linux/amd64
4:25:01 AM: Using PHP version 8.0
4:25:02 AM: Started restoring cached Node.js version
4:25:03 AM: Finished restoring cached Node.js version
4:25:04 AM: v14.3.0 is already installed.
4:25:04 AM: Now using node v14.3.0 (npm v6.14.5)
4:25:04 AM: Started restoring cached build plugins
4:25:04 AM: Finished restoring cached build plugins
4:25:04 AM: Successfully installed dependencies
4:25:04 AM: Starting build script
4:25:06 AM: Detected 1 framework(s)
4:25:06 AM: "cecil" at version "unknown"
4:25:06 AM: Section completed: initializing

Building

Complete
4:25:07 AM: Netlify Build
4:25:07 AM: ────────────────────────────────────────────────────────────────
4:25:07 AM:
4:25:07 AM: ❯ Version
4:25:07 AM: @netlify/build 29.58.9
4:25:07 AM:
4:25:07 AM: ❯ Flags
4:25:07 AM: accountId: 5a55185e8198766884f04205
4:25:07 AM: baseRelDir: false
4:25:07 AM: buildId: 67ad73faf2fc650008f89e0d
4:25:07 AM: deployId: 67ad73faf2fc650008f89e0f
4:25:07 AM:
4:25:07 AM: ❯ Current directory
4:25:07 AM: /opt/build/repo/netlify
4:25:07 AM:
4:25:07 AM: ❯ Config file
4:25:07 AM: /opt/build/repo/netlify.toml
4:25:07 AM:
4:25:07 AM: ❯ Context
4:25:07 AM: deploy-preview
4:25:07 AM:
4:25:07 AM: build.command from netlify.toml
4:25:07 AM: ────────────────────────────────────────────────────────────────
4:25:07 AM:
4:25:07 AM: $ (wget --no-check-certificate --no-cache https://raw.githubusercontent.com/cncf/landscapeapp/master/netlify/landscape.js) && node landscape.js
4:25:07 AM: Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.108.133, 185.199.111.133, 185.199.109.133, ...
4:25:07 AM: Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.108.133|:443... connected.
4:25:07 AM: HTTP request sent, awaiting response... 200 OK
4:25:07 AM: Length: 8750 (8.5K) [text/plain]
4:25:07 AM: Saving to: ‘landscape.js’
4:25:07 AM: 0K ........ 100% 109M=0s
4:25:07 AM: 2025-02-13 04:25:07 (109 MB/s) - ‘landscape.js’ saved [8750/8750]
4:25:07 AM: We have a secret: c8***75
4:25:07 AM: We have a secret: 8G***pb
4:25:07 AM: We have a secret: 87***eb
4:25:07 AM: We have a secret: gh***7r
4:25:07 AM: starting /opt/build/repo/netlify
4:25:07 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:25:08 AM: Warning: Permanently added '147.75.199.15' (ECDSA) to the list of known hosts.
4:25:08 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:25:08 AM: * Documentation: https://help.ubuntu.com
4:25:08 AM: * Management: https://landscape.canonical.com
4:25:08 AM: * Support: https://ubuntu.com/advantage
4:25:08 AM: System information as of Thu Feb 13 04:25:08 UTC 2025
4:25:08 AM: System load: 0.0
4:25:08 AM: Usage of /: 73.4% of 217.51GB
4:25:08 AM: Memory usage: 20%
4:25:08 AM: Swap usage: 1%
4:25:08 AM: Processes: 590
4:25:08 AM: Users logged in: 1
4:25:08 AM: IPv4 address for bond0: 147.75.199.15
4:25:08 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:25:08 AM: IPv4 address for docker0: 172.17.0.1
4:25:08 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:25:08 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:25:08 AM: 82 updates can be applied immediately.
4:25:08 AM: 7 of these updates are standard security updates.
4:25:08 AM: To see these additional updates run: apt list --upgradable
4:25:08 AM: New release '22.04.5 LTS' available.
4:25:08 AM: Run 'do-release-upgrade' to upgrade to it.
4:25:08 AM: 2 updates could not be installed automatically. For more details,
4:25:08 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:25:08 AM: *** System restart required ***
4:25:08 AM: Cloning into 'packageRemote'...
4:25:08 AM: node version: v18.3
4:25:10 AM: focal: Pulling from netlify/build
4:25:10 AM: d9802f032d67: Already exists
4:25:10 AM: 148a94032b4d: Already exists
4:25:10 AM: 7b71ded1a237: Pulling fs layer
4:25:10 AM: ef9bba4f8bbc: Pulling fs layer
4:25:10 AM: 81c086ee33cf: Pulling fs layer
4:25:10 AM: fbbf5a73fb1f: Pulling fs layer
4:25:10 AM: e5fa135336b7: Pulling fs layer
4:25:10 AM: ef9bba4f8bbc: Download complete
4:25:10 AM: a0b6e8edf139: Pulling fs layer
4:25:10 AM: 2e7eeab36566: Pulling fs layer
4:25:10 AM: 81c086ee33cf: Download complete
4:25:10 AM: c620b1383875: Pulling fs layer
4:25:10 AM: af77ffc2ca99: Pulling fs layer
4:25:10 AM: a0b6e8edf139: Download complete
4:25:10 AM: c620b1383875: Waiting
4:25:10 AM: dbafaacb3a8f: Pulling fs layer
4:25:10 AM: d07364f7b55a: Pulling fs layer
4:25:10 AM: af77ffc2ca99: Waiting
4:25:10 AM: dbafaacb3a8f: Waiting
4:25:10 AM: 5b38bebb155b: Pulling fs layer
4:25:10 AM: f5f8b11e3e6f: Pulling fs layer
4:25:10 AM: d07364f7b55a: Waiting
4:25:10 AM: fe1d21b79701: Pulling fs layer
4:25:10 AM: 5b38bebb155b: Waiting
4:25:10 AM: 405dc4173f88: Pulling fs layer
4:25:10 AM: f5f8b11e3e6f: Waiting
4:25:10 AM: 45ffd735976e: Pulling fs layer
4:25:10 AM: fe1d21b79701: Waiting
4:25:10 AM: dbd7aebca080: Pulling fs layer
4:25:10 AM: 405dc4173f88: Waiting
4:25:10 AM: a331ffc521f1: Pulling fs layer
4:25:10 AM: 45ffd735976e: Waiting
4:25:10 AM: c9a3788613b7: Pulling fs layer
4:25:10 AM: e6f67f269301: Pulling fs layer
4:25:10 AM: dbd7aebca080: Waiting
4:25:10 AM: 16068370ce69: Pulling fs layer
4:25:10 AM: b02b24c01ca9: Pulling fs layer
4:25:10 AM: a331ffc521f1: Waiting
4:25:10 AM: e6f67f269301: Waiting
4:25:10 AM: 16068370ce69: Waiting
4:25:10 AM: c9a3788613b7: Waiting
4:25:10 AM: cb0bad2154a2: Pulling fs layer
4:25:10 AM: 88eacf212ac9: Pulling fs layer
4:25:10 AM: 9d6f03e34d87: Pulling fs layer
4:25:10 AM: b02b24c01ca9: Waiting
4:25:10 AM: d93ef7b91d51: Pulling fs layer
4:25:10 AM: cb0bad2154a2: Waiting
4:25:10 AM: c1ec5e9f6e06: Pulling fs layer
4:25:10 AM: 88eacf212ac9: Waiting
4:25:10 AM: f14cafbdea11: Pulling fs layer
4:25:10 AM: 9d6f03e34d87: Waiting
4:25:10 AM: 251fe5b81f6c: Pulling fs layer
4:25:10 AM: baab14fc9730: Pulling fs layer
4:25:10 AM: d93ef7b91d51: Waiting
4:25:10 AM: 690b93a4876f: Pulling fs layer
4:25:10 AM: f14cafbdea11: Waiting
4:25:10 AM: 251fe5b81f6c: Waiting
4:25:10 AM: c1ec5e9f6e06: Waiting
4:25:10 AM: 14c05d70b665: Pulling fs layer
4:25:10 AM: baab14fc9730: Waiting
4:25:10 AM: 9ef736a9d750: Pulling fs layer
4:25:10 AM: 690b93a4876f: Waiting
4:25:10 AM: d9b8c1db03bb: Pulling fs layer
4:25:10 AM: 14c05d70b665: Waiting
4:25:10 AM: af19fe5a7a2a: Pulling fs layer
4:25:10 AM: cd2ccc985c5e: Pulling fs layer
4:25:10 AM: 4f4fb700ef54: Pulling fs layer
4:25:10 AM: efc6b19309ea: Pulling fs layer
4:25:10 AM: fee86aa2c1f9: Pulling fs layer
4:25:10 AM: d9b8c1db03bb: Waiting
4:25:10 AM: 9ef736a9d750: Waiting
4:25:10 AM: 861df937547a: Pulling fs layer
4:25:10 AM: 4f4fb700ef54: Waiting
4:25:10 AM: a008954a8ad0: Pulling fs layer
4:25:10 AM: cd2ccc985c5e: Waiting
4:25:10 AM: fee86aa2c1f9: Waiting
4:25:10 AM: af19fe5a7a2a: Waiting
4:25:10 AM: ed0a9334c1dd: Pulling fs layer
4:25:10 AM: 44004c9ad1eb: Pulling fs layer
4:25:10 AM: efc6b19309ea: Waiting
4:25:10 AM: 86e21bc0f82c: Pulling fs layer
4:25:10 AM: 861df937547a: Waiting
4:25:10 AM: ed0a9334c1dd: Waiting
4:25:10 AM: 6d8397ea31ac: Pulling fs layer
4:25:10 AM: 44004c9ad1eb: Waiting
4:25:10 AM: 16d5840c609c: Pulling fs layer
4:25:10 AM: a008954a8ad0: Waiting
4:25:10 AM: 86e21bc0f82c: Waiting
4:25:10 AM: 376c147b1846: Pulling fs layer
4:25:10 AM: 6d8397ea31ac: Waiting
4:25:10 AM: 239f618f2885: Pulling fs layer
4:25:10 AM: 16d5840c609c: Waiting
4:25:10 AM: 12974b2eb650: Pulling fs layer
4:25:10 AM: 376c147b1846: Waiting
4:25:10 AM: 239f618f2885: Waiting
4:25:10 AM: 12974b2eb650: Waiting
4:25:10 AM: 2e7eeab36566: Verifying Checksum
4:25:10 AM: 2e7eeab36566: Download complete
4:25:11 AM: 7b71ded1a237: Pull complete
4:25:11 AM: ef9bba4f8bbc: Pull complete
4:25:11 AM: c620b1383875: Download complete
4:25:11 AM: 81c086ee33cf: Pull complete
4:25:11 AM: e5fa135336b7: Verifying Checksum
4:25:11 AM: e5fa135336b7: Download complete
4:25:12 AM: dbafaacb3a8f: Verifying Checksum
4:25:12 AM: dbafaacb3a8f: Download complete
4:25:13 AM: d07364f7b55a: Download complete
4:25:15 AM: fbbf5a73fb1f: Verifying Checksum
4:25:15 AM: fbbf5a73fb1f: Download complete
4:25:16 AM: f5f8b11e3e6f: Verifying Checksum
4:25:16 AM: f5f8b11e3e6f: Download complete
4:25:16 AM: af77ffc2ca99: Verifying Checksum
4:25:16 AM: af77ffc2ca99: Download complete
4:25:16 AM: fe1d21b79701: Download complete
4:25:16 AM: 405dc4173f88: Verifying Checksum
4:25:16 AM: 405dc4173f88: Download complete
4:25:17 AM: dbd7aebca080: Verifying Checksum
4:25:17 AM: dbd7aebca080: Download complete
4:25:17 AM: a331ffc521f1: Verifying Checksum
4:25:17 AM: a331ffc521f1: Download complete
4:25:18 AM: 45ffd735976e: Verifying Checksum
4:25:18 AM: 45ffd735976e: Download complete
4:25:18 AM: e6f67f269301: Verifying Checksum
4:25:18 AM: e6f67f269301: Download complete
4:25:19 AM: 16068370ce69: Verifying Checksum
4:25:19 AM: 16068370ce69: Download complete
4:25:20 AM: b02b24c01ca9: Verifying Checksum
4:25:20 AM: b02b24c01ca9: Download complete
4:25:21 AM: c9a3788613b7: Verifying Checksum
4:25:21 AM: c9a3788613b7: Download complete
4:25:21 AM: cb0bad2154a2: Verifying Checksum
4:25:21 AM: cb0bad2154a2: Download complete
4:25:23 AM: 88eacf212ac9: Verifying Checksum
4:25:23 AM: 88eacf212ac9: Download complete
4:25:23 AM: d93ef7b91d51: Verifying Checksum
4:25:23 AM: d93ef7b91d51: Download complete
4:25:23 AM: c1ec5e9f6e06: Verifying Checksum
4:25:23 AM: c1ec5e9f6e06: Download complete
4:25:24 AM: 5b38bebb155b: Download complete
4:25:25 AM: 251fe5b81f6c: Verifying Checksum
4:25:25 AM: 251fe5b81f6c: Download complete
4:25:26 AM: f14cafbdea11: Verifying Checksum
4:25:26 AM: f14cafbdea11: Download complete
4:25:26 AM: 9d6f03e34d87: Verifying Checksum
4:25:26 AM: 9d6f03e34d87: Download complete
4:25:26 AM: 690b93a4876f: Download complete
4:25:26 AM: 14c05d70b665: Download complete
4:25:26 AM: 9ef736a9d750: Verifying Checksum
4:25:26 AM: 9ef736a9d750: Download complete
4:25:27 AM: d9b8c1db03bb: Download complete
4:25:27 AM: af19fe5a7a2a: Verifying Checksum
4:25:27 AM: af19fe5a7a2a: Download complete
4:25:27 AM: 4f4fb700ef54: Download complete
4:25:28 AM: baab14fc9730: Verifying Checksum
4:25:28 AM: baab14fc9730: Download complete
4:25:28 AM: fee86aa2c1f9: Download complete
4:25:28 AM: 861df937547a: Verifying Checksum
4:25:28 AM: 861df937547a: Download complete
4:25:28 AM: a008954a8ad0: Verifying Checksum
4:25:28 AM: a008954a8ad0: Download complete
4:25:29 AM: fbbf5a73fb1f: Pull complete
4:25:30 AM: e5fa135336b7: Pull complete
4:25:31 AM: a0b6e8edf139: Pull complete
4:25:31 AM: 2e7eeab36566: Pull complete
4:25:31 AM: c620b1383875: Pull complete
4:25:33 AM: cd2ccc985c5e: Verifying Checksum
4:25:33 AM: cd2ccc985c5e: Download complete
4:25:34 AM: 44004c9ad1eb: Verifying Checksum
4:25:34 AM: 44004c9ad1eb: Download complete
4:25:34 AM: af77ffc2ca99: Pull complete
4:25:34 AM: 86e21bc0f82c: Download complete
4:25:34 AM: dbafaacb3a8f: Pull complete
4:25:35 AM: d07364f7b55a: Pull complete
4:25:35 AM: 6d8397ea31ac: Download complete
4:25:35 AM: 16d5840c609c: Verifying Checksum
4:25:35 AM: 16d5840c609c: Download complete
4:25:35 AM: 376c147b1846: Verifying Checksum
4:25:35 AM: 376c147b1846: Download complete
4:25:35 AM: 239f618f2885: Verifying Checksum
4:25:35 AM: 239f618f2885: Download complete
4:25:35 AM: 12974b2eb650: Verifying Checksum
4:25:35 AM: 12974b2eb650: Download complete
4:25:36 AM: 5b38bebb155b: Pull complete
4:25:36 AM: f5f8b11e3e6f: Pull complete
4:25:36 AM: fe1d21b79701: Pull complete
4:25:36 AM: 405dc4173f88: Pull complete
4:25:37 AM: efc6b19309ea: Verifying Checksum
4:25:37 AM: efc6b19309ea: Download complete
4:25:37 AM: 45ffd735976e: Pull complete
4:25:37 AM: dbd7aebca080: Pull complete
4:25:38 AM: a331ffc521f1: Pull complete
4:25:39 AM: ed0a9334c1dd: Verifying Checksum
4:25:39 AM: ed0a9334c1dd: Download complete
4:25:40 AM: c9a3788613b7: Pull complete
4:25:40 AM: e6f67f269301: Pull complete
4:25:41 AM: 16068370ce69: Pull complete
4:25:41 AM: b02b24c01ca9: Pull complete
4:25:42 AM: cb0bad2154a2: Pull complete
4:25:42 AM: 88eacf212ac9: Pull complete
4:25:45 AM: 9d6f03e34d87: Pull complete
4:25:45 AM: d93ef7b91d51: Pull complete
4:25:45 AM: c1ec5e9f6e06: Pull complete
4:25:45 AM: f14cafbdea11: Pull complete
4:25:46 AM: 251fe5b81f6c: Pull complete
4:25:46 AM: baab14fc9730: Pull complete
4:25:46 AM: 690b93a4876f: Pull complete
4:25:46 AM: 14c05d70b665: Pull complete
4:25:46 AM: 9ef736a9d750: Pull complete
4:25:46 AM: d9b8c1db03bb: Pull complete
4:25:46 AM: af19fe5a7a2a: Pull complete
4:25:51 AM: cd2ccc985c5e: Pull complete
4:25:51 AM: 4f4fb700ef54: Pull complete
4:25:54 AM: efc6b19309ea: Pull complete
4:25:54 AM: fee86aa2c1f9: Pull complete
4:25:54 AM: 861df937547a: Pull complete
4:25:54 AM: a008954a8ad0: Pull complete
4:25:56 AM: ed0a9334c1dd: Pull complete
4:25:56 AM: 44004c9ad1eb: Pull complete
4:25:57 AM: 86e21bc0f82c: Pull complete
4:25:57 AM: 6d8397ea31ac: Pull complete
4:25:57 AM: 16d5840c609c: Pull complete
4:25:57 AM: 376c147b1846: Pull complete
4:25:57 AM: 239f618f2885: Pull complete
4:25:57 AM: 12974b2eb650: Pull complete
4:25:57 AM: Digest: sha256:a53361ff11a8e42c6088aa85a401da4ac76b8c9bed96731e2f85536028417ef1
4:25:57 AM: Status: Downloaded newer image for netlify/build:focal
4:25:57 AM: docker.io/netlify/build:focal
4:25:58 AM: 12974b2eb650: Pull complete
4:25:58 AM: Digest: sha256:a53361ff11a8e42c6088aa85a401da4ac76b8c9bed96731e2f85536028417ef1
4:25:58 AM: Status: Downloaded newer image for netlify/build:focal
4:25:58 AM: docker.io/netlify/build:focal
4:26:02 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:26:02 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:26:02 AM: * Documentation: https://help.ubuntu.com
4:26:02 AM: * Management: https://landscape.canonical.com
4:26:02 AM: * Support: https://ubuntu.com/advantage
4:26:02 AM: System information as of Thu Feb 13 04:26:02 UTC 2025
4:26:02 AM: System load: 1.69
4:26:02 AM: Usage of /: 75.8% of 217.51GB
4:26:02 AM: Memory usage: 20%
4:26:02 AM: Swap usage: 1%
4:26:02 AM: Processes: 637
4:26:02 AM: Users logged in: 1
4:26:02 AM: IPv4 address for bond0: 147.75.199.15
4:26:02 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:26:02 AM: IPv4 address for docker0: 172.17.0.1
4:26:02 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:26:02 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:26:02 AM: 82 updates can be applied immediately.
4:26:02 AM: 7 of these updates are standard security updates.
4:26:02 AM: To see these additional updates run: apt list --upgradable
4:26:02 AM: New release '22.04.5 LTS' available.
4:26:02 AM: Run 'do-release-upgrade' to upgrade to it.
4:26:02 AM: 2 updates could not be installed automatically. For more details,
4:26:02 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:26:02 AM: *** System restart required ***
4:26:16 AM: /opt/buildhome/.nvm/nvm.sh
4:26:16 AM: .:
4:26:16 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:26:16 AM: bin landscapes_dev package.json update_server
4:26:16 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:26:16 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:26:16 AM: files LICENSE server.js
4:26:16 AM: _headers netlify specs
4:26:16 AM: INSTALL.md netlify.md src
4:26:16 AM: v18.3
4:26:17 AM: Downloading and installing node v18.3.0...
4:26:18 AM: Computing checksum with sha256sum
4:26:18 AM: Checksums matched!
4:26:21 AM: Now using node v18.3.0 (npm v8.11.0)
4:26:21 AM: Now using node v18.3.0 (npm v8.11.0)
4:26:22 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:26:22 AM:
4:26:22 AM: added 3 packages, and audited 4 packages in 510ms
4:26:22 AM: found 0 vulnerabilities
4:26:22 AM: npm
4:26:22 AM: WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:26:22 AM:
4:26:25 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:26:25 AM: 27 packages are looking for funding
4:26:25 AM: run `npm fund` for details
4:26:25 AM: found 0 vulnerabilities
4:26:26 AM: added 1 package in 1s
4:26:27 AM: YN0000: ┌ Resolution step
4:26:27 AM: YN0000: └ Completed
4:26:27 AM: YN0000: ┌ Fetch step
4:26:33 AM: YN0013: │ 2 packages were already cached, 808 had to be fetched
4:26:33 AM: YN0000: └ Completed in 6s 214ms
4:26:33 AM: YN0000: ┌ Link step
4:26:34 AM: YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:26:35 AM: YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:26:35 AM: YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:26:35 AM: YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:26:36 AM: YN0000: │ puppeteer@npm:13.2.0 STDERR
4:26:36 AM: YN0000: │ puppeteer@npm:14.2.1 STDERR
4:26:40 AM: YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:26:40 AM: YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:26:40 AM: YN0000: └ Completed in 6s 818ms
4:26:40 AM: YN0000: Done with warnings in 13s 452ms
4:26:43 AM: Processing the tree
4:26:45 AM: Running with a level=easy. Settings:
4:26:45 AM: Use cached crunchbase data: true
4:26:45 AM: Use cached images data: true
4:26:45 AM: Use cached twitter data: true
4:26:45 AM: Use cached github basic stats: true
4:26:45 AM: Use cached github start dates: true
4:26:45 AM: Use cached best practices: true
4:26:45 AM: Fetching crunchbase entries
4:26:46 AM: ................................................................................
4:26:46 AM: ................................................................................
4:26:46 AM: ................................................................................
4:26:46 AM: ................................................................................
4:26:46 AM: ................................................................................
4:26:46 AM: ....................................................**
4:26:46 AM: Fetching github entries
4:26:54 AM: ................................................................................
4:26:54 AM: ................................................................................
4:26:54 AM: ..................................*********************.........................
4:26:54 AM: ................................................................................
4:26:54 AM: ...................................................................*********
4:26:54 AM: Fetching start date entries
4:26:57 AM: ................................................................................
4:26:57 AM: ................................................................................
4:26:57 AM: ............................................***********.........................
4:26:57 AM: ................................................................................
4:26:57 AM: .........................................................*******************
4:26:57 AM: Fetching images
4:26:57 AM: got image entries
4:26:57 AM: Hash for Prefect is prefect-2
4:27:04 AM: ................................................................................
4:27:04 AM: ....**......**..................................................................
4:27:04 AM: ................................................................................
4:27:04 AM: ................................................................................
4:27:04 AM: ................................................................................
4:27:04 AM: ................................................................................
4:27:04 AM: Fetching last tweet dates
4:27:04 AM: Fetching best practices
4:27:05 AM: ................................................................................
4:27:05 AM: ................................................................................
4:27:05 AM: ................................................................................
4:27:05 AM: ................................................................................
4:27:05 AM: ...............................................
4:27:05 AM: Fetching CLOMonitor data
4:27:05 AM: Processing the tree
4:27:05 AM: saving!
4:27:07 AM: Hash for Prefect is prefect-2
4:27:08 AM: Warning: Open Platform for Enterprise AI (OPEA) has a twitter https://twitter.com/opeadev which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Dragonfly has a twitter https://twitter.com/dragonfly_oss which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Vald has a twitter https://twitter.com/vdaas_vald which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: FastTrackML has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Mathesar has a twitter https://twitter.com/centerofci which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: RWKV has a twitter https://twitter.com/RWKV_AI which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Langchain has a twitter https://twitter.com/LangChainAI which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Haystack has a twitter https://twitter.com/deepset_ai which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Armada has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: envd has a twitter https://twitter.com/TensorChord which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: NVIDIA Corporation has a twitter https://twitter.com/nvidia which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Advanced Micro Devices (AMD) has a twitter https://twitter.com/AMD which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Alibaba Cloud (Singapore) Private LTD has a twitter https://twitter.com/AlibabaB2B which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Fujitsu Limited has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: GitCode has a twitter https://twitter.com/turingbook which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Neo4j, Inc. has a twitter https://twitter.com/neo4j which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Posit has a twitter https://twitter.com/posit_pbc which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Snyk Limited has a twitter https://twitter.com/snyksec which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: AI for People has a twitter https://twitter.com/AIforPeople which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Ambianic.ai has a twitter https://twitter.com/ambianicai which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Columbia University has a twitter https://twitter.com/Columbia which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Common Crawl Foundation has a twitter https://twitter.com/commoncrawl which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Ersilia Open Source Initiative has a twitter https://twitter.com/ersiliaio which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Ewha Institute for Biomedical Law & Ethics has a twitter https://twitter.com/cybercampus which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Montreal AI Ethics Institute has a twitter https://twitter.com/mtlaiethics which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: NIPA has a twitter https://twitter.com/NIPAkr which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: One Fact Foundation has a twitter https://twitter.com/onefact_org which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Pranveer Singh Institute Of Technology has a twitter https://twitter.com/PSITKanpur2004 which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Rensselaer Polytechnic Institute has a twitter https://twitter.com/rpi which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Transport for NSW has a twitter https://twitter.com/TransportforNSW which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: University of Michigan has a twitter https://twitter.com/UMich which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Wikimedia Deutschland e. V. has a twitter https://twitter.com/WikimediaDE which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: World Ethical Data Foundation has a twitter https://twitter.com/WEDF_foundation which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: BasicAI(hosting) has a twitter https://twitter.com/BasicAIteam which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Fujitsu Limited(hosting) has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:27:08 AM: Warning: Owkin(hosting) has a twitter https://twitter.com/owkinscience which is invalid or we just can not fetch its tweets
4:27:08 AM: Fetching members from LF AI & Data Member Company category
4:27:08 AM: Processing the tree
4:27:09 AM: Assigning Premier membership on DataHub (LinkedIn) because its parent Microsoft has Premier membership
4:27:09 AM: Assigning Premier membership on Brooklin (LinkedIn) because its parent Microsoft has Premier membership
4:27:09 AM: Assigning Premier membership on Azkaban (LinkedIn) because its parent Microsoft has Premier membership
4:27:09 AM: Assigning Premier membership on LinkedIn (hosting) (LinkedIn) because its parent Microsoft has Premier membership
4:27:09 AM: Hash for Chaitanya Bharathi Institute of Technology is chaitanya-bharathi-institute-of-technology-2
4:27:09 AM: Hash for Fast.ai is fast-ai-2
4:27:09 AM: Hash for Great Expectations is great-expectations-2
4:27:09 AM: Hash for ML Perf is ml-perf-2
4:27:09 AM: Hash for PipelineAI is pipeline-ai-2
4:27:09 AM: Hash for Prefect is prefect-2
4:27:09 AM: Hash for Redash is redash-2
4:27:09 AM: Hash for Wikimedia Deutschland e.V. is wikimedia-deutschland-e-v-2
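The "Hash for X is x-2" lines above suggest the generator slugifies each item name and appends a numeric suffix when that slug is already taken. A minimal sketch of that behavior, assuming a slugify-then-deduplicate scheme (`makeSaneName` and its exact rules are illustrative, not the actual landscapeapp implementation):

```javascript
// Sketch: slugify a name, then de-duplicate with a "-2", "-3", ... suffix.
// This is an assumption inferred from the "Hash for X is x-2" log lines.
function makeSaneName(name, seen = new Set()) {
  const base = name
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')  // runs of non-alphanumerics become one dash
    .replace(/^-+|-+$/g, '');     // trim leading/trailing dashes
  let slug = base;
  for (let n = 2; seen.has(slug); n++) slug = `${base}-${n}`;
  seen.add(slug);
  return slug;
}
```

Under this scheme, a second entry slugifying to `prefect` would be reported as `prefect-2`, matching the log.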
4:27:12 AM: {
4:27:12 AM: name: 'Accord.NET',
4:27:12 AM: homepage_url: 'http://accord-framework.net/',
4:27:12 AM: logo: 'accord-net.svg',
4:27:12 AM: github_data: {
4:27:12 AM: languages: [
4:27:12 AM: [Object], [Object],
4:27:12 AM: [Object], [Object],
4:27:12 AM: [Object], [Object],
4:27:12 AM: [Object], [Object],
4:27:12 AM: [Object], [Object],
4:27:12 AM: [Object], [Object],
4:27:12 AM: [Object], [Object],
4:27:12 AM: [Object]
4:27:12 AM: ],
4:27:12 AM: contributions: '0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0',
4:27:12 AM: firstWeek: '2022-11-27Z',
4:27:12 AM: stars: 4404,
4:27:12 AM: license: 'GNU Lesser General Public License v2.1',
4:27:12 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:27:12 AM: latest_commit_date: '2020-11-18T19:53:01Z',
4:27:12 AM: latest_commit_link: '/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:27:12 AM: release_date: '2017-10-19T21:00:56Z',
4:27:12 AM: contributors_count: 98,
4:27:12 AM: },
4:27:12 AM: repos: [ { url: 'https://github.com/accord-net/framework', stars: 4404 } ],
4:27:12 AM: github_start_commit_data: {
4:27:12 AM: start_commit_link: '/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:27:12 AM: start_date: '2012-04-08T14:05:58Z'
4:27:12 AM: },
4:27:12 AM: image_data: {
4:27:12 AM: fileName: 'accord-net.svg',
4:27:12 AM: hash: 'tdvFWfo6hfZQqHqjscAtcoZWk6qztZFzBTVqKWUjYig='
4:27:12 AM: },
4:27:12 AM: firstCommitDate: '2012-04-08T14:05:58Z',
4:27:12 AM: latestCommitDate: '2020-11-18T19:53:01Z',
4:27:12 AM: releaseDate: '2017-10-19T21:00:56Z',
4:27:12 AM: commitsThisYear: 0,
4:27:12 AM: contributorsCount: 98,
4:27:12 AM: language: 'C#',
4:27:12 AM: stars: 4404,
4:27:12 AM: license: 'GNU Lesser General Public License v2.1',
4:27:12 AM: headquarters: 'Grenoble, France',
4:27:12 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:27:12 AM: organization: 'Accord.NET Framework',
4:27:12 AM: crunchbaseData: {
4:27:12 AM: name: 'Accord.NET Framework',
4:27:12 AM: description: 'Machine Learning Framework',
4:27:12 AM: homepage: 'http://accord-framework.net/',
4:27:12 AM: city: 'Grenoble',
4:27:12 AM: region: 'Rhone-Alpes',
4:27:12 AM: country: 'France',
4:27:12 AM: twitter: null,
4:27:12 AM: linkedin: null,
4:27:12 AM: acquisitions: [],
4:27:12 AM: parents: [],
4:27:12 AM: stockExchange: null,
4:27:12 AM: company_type: 'Non Profit',
4:27:12 AM: industries: [
4:27:12 AM: 'Analytics',
4:27:12 AM: 'Artificial Intelligence',
4:27:12 AM: 'Hardware',
4:27:12 AM: 'Machine Learning'
4:27:12 AM: ],
4:27:12 AM: numEmployeesMin: null,
4:27:12 AM: numEmployeesMax: null
4:27:12 AM: },
4:27:12 AM: path: 'Machine Learning / Framework',
4:27:12 AM: landscape: 'Machine Learning / Framework',
4:27:12 AM: category: 'Machine Learning',
4:27:12 AM: amount: 'N/A',
4:27:12 AM: oss: true,
4:27:12 AM: href: 'logos/accord-net.svg',
4:27:12 AM: bestPracticeBadgeId: false,
4:27:12 AM: bestPracticePercentage: null,
4:27:12 AM: industries: [
4:27:12 AM: 'Analytics',
4:27:12 AM: 'Artificial Intelligence',
4:27:12 AM: 'Hardware',
4:27:12 AM: 'Machine Learning'
4:27:12 AM: ],
4:27:12 AM: starsPresent: true,
4:27:12 AM: starsAsText: '4,404',
4:27:12 AM: marketCapPresent: false,
4:27:12 AM: marketCapAsText: 'N/A',
4:27:12 AM: id: 'accord-net',
4:27:12 AM: flatName: 'Accord.NET',
4:27:12 AM: member: false,
4:27:12 AM: relation: false,
4:27:12 AM: isSubsidiaryProject: false
4:27:12 AM: } 2020-11-18T19:53:01Z
4:27:12 AM: [
4:27:12 AM: 'Community Data License Agreement (CDLA)',
4:27:12 AM: 'PlaNet',
4:27:12 AM: 'Generic Neural Elastic Search (GNES)',
4:27:12 AM: 'PredictionIO',
4:27:12 AM: 'ELI5',
4:27:12 AM: 'BERT',
4:27:12 AM: 'Nauta',
4:27:12 AM: 'DAWNBench',
4:27:12 AM: 'AresDB',
4:27:12 AM: 'dotmesh',
4:27:12 AM: 'Audit AI',
4:27:12 AM: 'euler',
4:27:12 AM: 'Clipper',
4:27:12 AM: 'Accord.NET',
4:27:12 AM: 'Shogun',
4:27:12 AM: 'DELTA',
4:27:12 AM: 'BeakerX',
4:27:12 AM: 'PixieDust',
4:27:12 AM: 'TreeInterpreter',
4:27:12 AM: 'Cyclone',
4:27:12 AM: 'Lucid',
4:27:12 AM: 'XLM',
4:27:12 AM: 'Chainer RL',
4:27:12 AM: 'ForestFlow',
4:27:12 AM: 'uReplicator',
4:27:12 AM: 'Elastic Deep Learning (EDL)',
4:27:12 AM: 'Kashgari',
4:27:12 AM: 'X-DeepLearning',
4:27:12 AM: 'LIME',
4:27:12 AM: 'Model Asset eXchange (MAX)',
4:27:12 AM: 'TransmogrifAI',
4:27:12 AM: 'OpenBytes',
4:27:12 AM: 'DeepLIFT',
4:27:12 AM: 'Onepanel',
4:27:12 AM: 'DeepSpeech',
4:27:12 AM: 'Lucene',
4:27:12 AM: 'Turi Create',
4:27:12 AM: 'Visual Object Tagging Tool (VoTT)',
4:27:12 AM: 'Acumos',
4:27:12 AM: 'Skater',
4:27:12 AM: 'Catalyst',
4:27:12 AM: 'SKIP Language',
4:27:12 AM: 'SQLFlow',
4:27:12 AM: 'Advertorch',
4:27:12 AM: 'xLearn',
4:27:12 AM: 'Neuropod',
4:27:12 AM: 'AdvBox',
4:27:12 AM: 'RCloud',
4:27:12 AM: 'Neo-AI',
4:27:12 AM: 'Embedded Learning Library',
4:27:12 AM: 'Stable Baselines',
4:27:12 AM: 'talos',
4:27:12 AM: 'LabelImg',
4:27:12 AM: 'MMdnn',
4:27:12 AM: 'CNTK',
4:27:12 AM: 'Machine Learning eXchange',
4:27:12 AM: 'Singularity',
4:27:12 AM: 'Chainer',
4:27:12 AM: 'PyText',
4:27:12 AM: 'Pipeline.ai',
4:27:12 AM: 'Apache Bahir',
4:27:12 AM: 'NLP Architect',
4:27:12 AM: 'AllenNLP',
4:27:12 AM: 'Angel-ML',
4:27:12 AM: 'SEED RL',
4:27:12 AM: 'Coach',
4:27:12 AM: 'Gluon-NLP',
4:27:12 AM: 'DeepMind Lab',
4:27:12 AM: 'SEAL',
4:27:12 AM: 'MXNet',
4:27:12 AM: 'OpenAI Gym',
4:27:12 AM: 'MindMeld',
4:27:12 AM: 'CleverHans',
4:27:12 AM: 'Petastorm',
4:27:12 AM: 'Hawq',
4:27:12 AM: 'TF Encrypted',
4:27:12 AM: 'faust',
4:27:12 AM: 'Cortex',
4:27:12 AM: 'OpenDataology',
4:27:12 AM: 'YouTokenToMe',
4:27:12 AM: 'ALBERT',
4:27:12 AM: 'Adlik',
4:27:12 AM: '1chipML',
4:27:12 AM: 'Neural Network Distiller',
4:27:12 AM: 'Labelbox',
4:27:12 AM: 'Facets',
4:27:12 AM: 'OpenNN',
4:27:12 AM: 'Pilosa',
4:27:12 AM: 'Orchest',
4:27:12 AM: 'Model Server for Apache MXNet',
4:27:12 AM: 'LASER',
4:27:12 AM: 'Dopamine',
4:27:12 AM: 'MindSpore',
4:27:12 AM: 'HE Lib',
4:27:12 AM: 'd6tflow',
4:27:12 AM: 'Sonnet',
4:27:12 AM: 'Plaid ML',
4:27:12 AM: 'Nyoka',
4:27:12 AM: 'doccano',
4:27:12 AM: 'ecco',
4:27:12 AM: ... 252 more items
4:27:12 AM: ]
4:27:16 AM: ncc: Version 0.34.0
4:27:16 AM: ncc: Compiling file index.js into CJS
4:27:17 AM: ncc: Version 0.34.0
4:27:17 AM: ncc: Compiling file index.js into CJS
4:27:18 AM: ncc: Version 0.34.0
4:27:18 AM: ncc: Compiling file index.js into CJS
4:27:22 AM: Development server running at http://127.0.0.1:4000/
4:27:32 AM: Running integration-test,check-landscape,render-landscape,funding in parallel
4:27:34 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:27:36 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:27:37 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:27:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:27:39 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:27:40 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:27:41 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:27:42 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:27:42 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:27:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:27:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:27:44 AM: api request starting... /api/ids?category=&project=&license=&organization=accord-net-framework&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:27:44 AM: api request starting... /api/ids?category=&project=&license=&organization=microsoft&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:27:54 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:28:15 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
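Each `/api/ids` request above carries the full set of filter parameters, left empty when unused. A hedged sketch of assembling one such URL (the parameter names come from the log; the builder function itself is hypothetical):

```javascript
// Illustrative builder for an /api/ids URL of the shape logged above.
// Keys mirror the query parameters seen in the log, in the same order.
function buildIdsUrl(filters = {}) {
  const keys = ['category', 'project', 'license', 'organization', 'headquarters',
    'company-type', 'industries', 'sort', 'grouping', 'bestpractices',
    'enduser', 'parent', 'language', 'specification', 'format'];
  const query = keys
    .map(k => `${k}=${encodeURIComponent(filters[k] || '')}`)
    .join('&');
  return `/api/ids?${query}`;
}
```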
4:28:35 AM: Task: integration-test PASS specs/main.spec.js (9.624s)
4:28:35 AM: Main test
4:28:35 AM: I visit a main page and have all required elements
4:28:35 AM: ✓ I can open a page (1689ms)
4:28:35 AM: ✓ A proper header is present (6ms)
4:28:35 AM: ✓ Group headers are ok (3ms)
4:28:35 AM: ✓ I see a You are viewing text (2ms)
4:28:35 AM: ✓ A proper card is present (4ms)
4:28:35 AM: ✓ If I click on a card, I see a modal dialog (334ms)
4:28:35 AM: ✓ Closing a browser (25ms)
4:28:35 AM: Landscape Test
4:28:35 AM: ✓ I visit LF AI & Data Members landscape page and have all required elements, elements are clickable (908ms)
4:28:35 AM: ✓ Closing a browser (28ms)
4:28:35 AM: ✓ I visit Companies Hosting Projects landscape page and have all required elements, elements are clickable (878ms)
4:28:35 AM: ✓ Closing a browser (22ms)
4:28:35 AM: I visit a main landscape page and have all required elements
4:28:35 AM: ✓ I open a landscape page and wait for it to load (1848ms)
4:28:35 AM: ✓ When I click on an item the modal is open (90ms)
4:28:35 AM: ✓ If I would straight open the url with a selected id, a modal appears (1865ms)
4:28:35 AM: ✓ Closing a browser (31ms)
4:28:35 AM: Filtering by organization
4:28:35 AM: ✓ Checking we see Accord.NET when filtering by organization Accord.NET Framework (577ms)
4:28:35 AM: ✓ Checking we don't see Accord.NET when filtering by organization Microsoft (239ms)
4:28:35 AM: ✓ Closing a browser (23ms)
4:28:35 AM: PASS specs/tools/actualTwitter.spec.js
4:28:35 AM: Twitter URL
4:28:35 AM: when crunchbase data not set
4:28:35 AM: ✓ returns URL from node (2ms)
4:28:35 AM: when node does not have twitter URL
4:28:35 AM: ✓ returns URL from node (1ms)
4:28:35 AM: when node has twitter URL set to null
4:28:35 AM: ✓ returns undefined
4:28:35 AM: when both node and crunchbase have twitter URL
4:28:35 AM: ✓ returns URL from node (1ms)
4:28:35 AM: when twitter URL is not set anywhere
4:28:35 AM: ✓ returns undefined
4:28:35 AM: cleaning up twitter URL
4:28:35 AM: ✓ replaces http with https (1ms)
4:28:35 AM: ✓ removes www
4:28:35 AM: ✓ query string
4:28:35 AM: Test Suites: 2 passed, 2 total
4:28:35 AM: Tests: 26 passed, 26 total
4:28:35 AM: Snapshots: 0 total
4:28:35 AM: Time: 9.889s
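The `actualTwitter.spec.js` cases above describe a normalization that prefers the node's own Twitter URL over crunchbase data, treats an explicit `null` as "no URL", forces https, drops `www.`, and strips the query string. A minimal sketch consistent with those cases (the function name and shape are assumptions, not the real helper):

```javascript
// Sketch of the URL normalization the actualTwitter spec exercises.
// An explicit null on the node suppresses any crunchbase fallback.
function actualTwitter(node, crunchbase) {
  const raw = (node && node.twitter) !== undefined
    ? node.twitter
    : (crunchbase || {}).twitter;
  if (!raw) return undefined;
  return raw
    .replace(/^http:\/\//, 'https://')   // force https
    .replace('https://www.', 'https://') // remove www
    .split('?')[0];                      // strip query string
}
```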
4:28:35 AM: Task: check-landscape
4:28:35 AM: Task: render-landscape visiting http://localhost:4000/fullscreen?version=2025-02-13T04:27:33Z 60a7223&scale=false&pdf
4:28:35 AM: visiting http://localhost:4000/fullscreen?version=2025-02-13T04:27:33Z 60a7223&scale=false&pdf
4:28:35 AM: Task: funding From https://github.com/lfai/lfai-landscape
4:28:35 AM: * [new branch] NSouthernLF-patch-1 -> github/NSouthernLF-patch-1
4:28:35 AM: * [new branch] NSouthernLF-patch-2 -> github/NSouthernLF-patch-2
4:28:35 AM: * [new branch] NSouthernLF-patch-3 -> github/NSouthernLF-patch-3
4:28:35 AM: * [new branch] NSouthernLF-patch-3-1 -> github/NSouthernLF-patch-3-1
4:28:35 AM: * [new branch] create-pull-request/patch-1693628094 -> github/create-pull-request/patch-1693628094
4:28:35 AM: * [new branch] create-pull-request/patch-1693714471 -> github/create-pull-request/patch-1693714471
4:28:35 AM: * [new branch] create-pull-request/patch-1703668428 -> github/create-pull-request/patch-1703668428
4:28:35 AM: * [new branch] create-pull-request/patch-1739420663 -> github/create-pull-request/patch-1739420663
4:28:35 AM: * [new branch] main -> github/main
4:28:35 AM: * [new branch] revert-303-main -> github/revert-303-main
4:28:35 AM: * [new branch] web-landscape-2021-10-01T22-18 -> github/web-landscape-2021-10-01T22-18
4:28:35 AM: * [new branch] web-landscape-2021-10-12T16-06 -> github/web-landscape-2021-10-12T16-06
4:28:35 AM: * [new branch] web-landscape-2021-10-18T19-46 -> github/web-landscape-2021-10-18T19-46
4:28:35 AM: * [new branch] web-landscape-2021-10-20T18-58 -> github/web-landscape-2021-10-20T18-58
4:28:38 AM: Output from remote build, exit code: 0
4:28:38 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:28:38 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:28:38 AM: * Documentation: https://help.ubuntu.com
4:28:38 AM: * Management: https://landscape.canonical.com
4:28:38 AM: * Support: https://ubuntu.com/advantage
4:28:38 AM: System information as of Thu Feb 13 04:26:03 UTC 2025
4:28:38 AM: System load: 1.69
4:28:38 AM: Usage of /: 75.8% of 217.51GB
4:28:38 AM: Memory usage: 20%
4:28:38 AM: Swap usage: 1%
4:28:38 AM: Processes: 641
4:28:38 AM: Users logged in: 1
4:28:38 AM: IPv4 address for bond0: 147.75.199.15
4:28:38 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:28:38 AM: IPv4 address for docker0: 172.17.0.1
4:28:38 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:28:38 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:28:38 AM: 82 updates can be applied immediately.
4:28:38 AM: 7 of these updates are standard security updates.
4:28:38 AM: To see these additional updates run: apt list --upgradable
4:28:38 AM: New release '22.04.5 LTS' available.
4:28:38 AM: Run 'do-release-upgrade' to upgrade to it.
4:28:38 AM: 2 updates could not be installed automatically. For more details,
4:28:38 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:28:38 AM: *** System restart required ***
4:28:38 AM: /opt/buildhome/.nvm/nvm.sh
4:28:38 AM: .:
4:28:38 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:28:38 AM: bin landscapes_dev package.json update_server
4:28:38 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:28:38 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:28:38 AM: files LICENSE server.js
4:28:38 AM: _headers netlify specs
4:28:38 AM: INSTALL.md netlify.md src
4:28:38 AM: v18.3
4:28:38 AM: Downloading and installing node v18.3.0...
4:28:38 AM: Computing checksum with sha256sum
4:28:38 AM: Checksums matched!
4:28:38 AM: Now using node v18.3.0 (npm v8.11.0)
4:28:38 AM: Now using node v18.3.0 (npm v8.11.0)
4:28:38 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:28:38 AM:
4:28:38 AM: added 3 packages, and audited 4 packages in 510ms
4:28:38 AM: found 0 vulnerabilities
4:28:38 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:28:38 AM:
4:28:38 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:28:38 AM: 27 packages are looking for funding
4:28:38 AM: run `npm fund` for details
4:28:38 AM: found 0 vulnerabilities
4:28:38 AM: added 1 package in 1s
4:28:38 AM: YN0000: ┌ Resolution step
4:28:38 AM: YN0000: └ Completed
4:28:38 AM: YN0000: ┌ Fetch step
4:28:38 AM: YN0013: │ 2 packages were already cached, 808 had to be fetched
4:28:38 AM: YN0000: └ Completed in 6s 214ms
4:28:38 AM: YN0000: ┌ Link step
4:28:38 AM: YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:28:38 AM: YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:28:38 AM: YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:28:38 AM: YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:28:38 AM: YN0000: │ puppeteer@npm:13.2.0 STDERR
4:28:38 AM: YN0000: │ puppeteer@npm:14.2.1 STDERR
4:28:38 AM: YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:28:38 AM: YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:28:38 AM: YN0000: └ Completed in 6s 818ms
4:28:38 AM: YN0000: Done with warnings in 13s 452ms
4:28:38 AM: Processing the tree
4:28:38 AM: Running with a level=easy. Settings:
4:28:38 AM: Use cached crunchbase data: true
4:28:38 AM: Use cached images data: true
4:28:38 AM: Use cached twitter data: true
4:28:38 AM: Use cached github basic stats: true
4:28:38 AM: Use cached github start dates: true
4:28:38 AM: Use cached best practices: true
4:28:38 AM: Fetching crunchbase entries
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ....................................................**
4:28:38 AM: Fetching github entries
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ..................................*********************.........................
4:28:38 AM: ................................................................................
4:28:38 AM: ...................................................................*********
4:28:38 AM: Fetching start date entries
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ............................................***********.........................
4:28:38 AM: ................................................................................
4:28:38 AM: .........................................................*******************
4:28:38 AM: Fetching images
4:28:38 AM: got image entries
4:28:38 AM: Hash for Prefect is prefect-2
4:28:38 AM: ................................................................................
4:28:38 AM: ....**......**..................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: Fetching last tweet dates
4:28:38 AM: Fetching best practices
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ................................................................................
4:28:38 AM: ...............................................
4:28:38 AM: Fetching CLOMonitor data
4:28:38 AM: Processing the tree
4:28:38 AM: saving!
4:28:38 AM: Hash for Prefect is prefect-2
4:28:38 AM: Warning: Open Platform for Enterprise AI (OPEA) has a twitter https://twitter.com/opeadev which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Dragonfly has a twitter https://twitter.com/dragonfly_oss which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Vald has a twitter https://twitter.com/vdaas_vald which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: FastTrackML has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Mathesar has a twitter https://twitter.com/centerofci which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: RWKV has a twitter https://twitter.com/RWKV_AI which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Langchain has a twitter https://twitter.com/LangChainAI which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Haystack has a twitter https://twitter.com/deepset_ai which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Armada has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: envd has a twitter https://twitter.com/TensorChord which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: NVIDIA Corporation has a twitter https://twitter.com/nvidia which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Advanced Micro Devices (AMD) has a twitter https://twitter.com/AMD which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Alibaba Cloud (Singapore) Private LTD has a twitter https://twitter.com/AlibabaB2B which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Fujitsu Limited has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: GitCode has a twitter https://twitter.com/turingbook which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Neo4j, Inc. has a twitter https://twitter.com/neo4j which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Posit has a twitter https://twitter.com/posit_pbc which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Snyk Limited has a twitter https://twitter.com/snyksec which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: AI for People has a twitter https://twitter.com/AIforPeople which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Ambianic.ai has a twitter https://twitter.com/ambianicai which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Columbia University has a twitter https://twitter.com/Columbia which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Common Crawl Foundation has a twitter https://twitter.com/commoncrawl which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Ersilia Open Source Initiative has a twitter https://twitter.com/ersiliaio which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Ewha Institute for Biomedical Law & Ethics has a twitter https://twitter.com/cybercampus which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Montreal AI Ethics Institute has a twitter https://twitter.com/mtlaiethics which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: NIPA has a twitter https://twitter.com/NIPAkr which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: One Fact Foundation has a twitter https://twitter.com/onefact_org which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Pranveer Singh Institute Of Technology has a twitter https://twitter.com/PSITKanpur2004 which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Rensselaer Polytechnic Institute has a twitter https://twitter.com/rpi which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Transport for NSW has a twitter https://twitter.com/TransportforNSW which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: University of Michigan has a twitter https://twitter.com/UMich which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Wikimedia Deutschland e. V. has a twitter https://twitter.com/WikimediaDE which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: World Ethical Data Foundation has a twitter https://twitter.com/WEDF_foundation which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: BasicAI(hosting) has a twitter https://twitter.com/BasicAIteam which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Fujitsu Limited(hosting) has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:28:38 AM: Warning: Owkin(hosting) has a twitter https://twitter.com/owkinscience which is invalid or we just can not fetch its tweets
4:28:38 AM: Fetching members from LF AI & Data Member Company category
4:28:38 AM: Processing the tree
4:28:38 AM: Assigning Premier membership on DataHub (LinkedIn) because its parent Microsoft has Premier membership
4:28:38 AM: Assigning Premier membership on Brooklin (LinkedIn) because its parent Microsoft has Premier membership
4:28:38 AM: Assigning Premier membership on Azkaban (LinkedIn) because its parent Microsoft has Premier membership
4:28:38 AM: Assigning Premier membership on LinkedIn (hosting) (LinkedIn) because its parent Microsoft has Premier membership
4:28:38 AM: Hash for Chaitanya Bharathi Institute of Technology is chaitanya-bharathi-institute-of-technology-2
4:28:38 AM: Hash for Fast.ai is fast-ai-2
4:28:38 AM: Hash for Great Expectations is great-expectations-2
4:28:38 AM: Hash for ML Perf is ml-perf-2
4:28:38 AM: Hash for PipelineAI is pipeline-ai-2
4:28:38 AM: Hash for Prefect is prefect-2
4:28:38 AM: Hash for Redash is redash-2
4:28:38 AM: Hash for Wikimedia Deutschland e.V. is wikimedia-deutschland-e-v-2
4:28:38 AM: {
4:28:38 AM: name: 'Accord.NET',
4:28:38 AM: homepage_url: 'http://accord-framework.net/',
4:28:38 AM: logo: 'accord-net.svg',
4:28:38 AM: github_data: {
4:28:38 AM: languages: [
4:28:38 AM: [Object], [Object],
4:28:38 AM: [Object], [Object],
4:28:38 AM: [Object], [Object],
4:28:38 AM: [Object], [Object],
4:28:38 AM: [Object], [Object],
4:28:38 AM: [Object], [Object],
4:28:38 AM: [Object], [Object],
4:28:38 AM: [Object]
4:28:38 AM: ],
4:28:38 AM: contributions: '0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0',
4:28:38 AM: firstWeek: '2022-11-27Z',
4:28:38 AM: stars: 4404,
4:28:38 AM: license: 'GNU Lesser General Public License v2.1',
4:28:38 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:28:38 AM: latest_commit_date: '2020-11-18T19:53:01Z',
4:28:38 AM: latest_commit_link: '/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:28:38 AM: release_date: '2017-10-19T21:00:56Z',
4:28:38 AM: contributors_count: 98,
4:28:38 AM: },
4:28:38 AM: repos: [ { url: 'https://github.com/accord-net/framework', stars: 4404 } ],
4:28:38 AM: github_start_commit_data: {
4:28:38 AM: start_commit_link: '/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:28:38 AM: start_date: '2012-04-08T14:05:58Z'
4:28:38 AM: },
4:28:38 AM: image_data: {
4:28:38 AM: fileName: 'accord-net.svg',
4:28:38 AM: hash: 'tdvFWfo6hfZQqHqjscAtcoZWk6qztZFzBTVqKWUjYig='
4:28:38 AM: },
4:28:38 AM: firstCommitDate: '2012-04-08T14:05:58Z',
4:28:38 AM: latestCommitDate: '2020-11-18T19:53:01Z',
4:28:38 AM: releaseDate: '2017-10-19T21:00:56Z',
4:28:38 AM: commitsThisYear: 0,
4:28:38 AM: contributorsCount: 98,
4:28:38 AM: language: 'C#',
4:28:38 AM: stars: 4404,
4:28:38 AM: license: 'GNU Lesser General Public License v2.1',
4:28:38 AM: headquarters: 'Grenoble, France',
4:28:38 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:28:38 AM: organization: 'Accord.NET Framework',
4:28:38 AM: crunchbaseData: {
4:28:38 AM: name: 'Accord.NET Framework',
4:28:38 AM: description: 'Machine Learning Framework',
4:28:38 AM: homepage: 'http://accord-framework.net/',
4:28:38 AM: city: 'Grenoble',
4:28:38 AM: region: 'Rhone-Alpes',
4:28:38 AM: country: 'France',
4:28:38 AM: twitter: null,
4:28:38 AM: linkedin: null,
4:28:38 AM: acquisitions: [],
4:28:38 AM: parents: [],
4:28:38 AM: stockExchange: null,
4:28:38 AM: company_type: 'Non Profit',
4:28:38 AM: industries: [
4:28:38 AM: 'Analytics',
4:28:38 AM: 'Artificial Intelligence',
4:28:38 AM: 'Hardware',
4:28:38 AM: 'Machine Learning'
4:28:38 AM: ],
4:28:38 AM: numEmployeesMin: null,
4:28:38 AM: numEmployeesMax: null
4:28:38 AM: },
4:28:38 AM: path: 'Machine Learning / Framework',
4:28:38 AM: landscape: 'Machine Learning / Framework',
4:28:38 AM: category: 'Machine Learning',
4:28:38 AM: amount: 'N/A',
4:28:38 AM: oss: true,
4:28:38 AM: href: 'logos/accord-net.svg',
4:28:38 AM: bestPracticeBadgeId: false,
4:28:38 AM: bestPracticePercentage: null,
4:28:38 AM: industries: [
4:28:38 AM: 'Analytics',
4:28:38 AM: 'Artificial Intelligence',
4:28:38 AM: 'Hardware',
4:28:38 AM: 'Machine Learning'
4:28:38 AM: ],
4:28:38 AM: starsPresent: true,
4:28:38 AM: starsAsText: '4,404',
4:28:38 AM: marketCapPresent: false,
4:28:38 AM: marketCapAsText: 'N/A',
4:28:38 AM: id: 'accord-net',
4:28:38 AM: flatName: 'Accord.NET',
4:28:38 AM: member: false,
4:28:38 AM: relation: false,
4:28:38 AM: isSubsidiaryProject: false
4:28:38 AM: } 2020-11-18T19:53:01Z
4:28:38 AM: [
4:28:38 AM: 'Community Data License Agreement (CDLA)',
4:28:38 AM: 'PlaNet',
4:28:38 AM: 'Generic Neural Elastic Search (GNES)',
4:28:38 AM: 'PredictionIO',
4:28:38 AM: 'ELI5',
4:28:38 AM: 'BERT',
4:28:38 AM: 'Nauta',
4:28:38 AM: 'DAWNBench',
4:28:38 AM: 'AresDB',
4:28:38 AM: 'dotmesh',
4:28:38 AM: 'Audit AI',
4:28:38 AM: 'euler',
4:28:38 AM: 'Clipper',
4:28:38 AM: 'Accord.NET',
4:28:38 AM: 'Shogun',
4:28:38 AM: 'DELTA',
4:28:38 AM: 'BeakerX',
4:28:38 AM: 'PixieDust',
4:28:38 AM: 'TreeInterpreter',
4:28:38 AM: 'Cyclone',
4:28:38 AM: 'Lucid',
4:28:38 AM: 'XLM',
4:28:38 AM: 'Chainer RL',
4:28:38 AM: 'ForestFlow',
4:28:38 AM: 'uReplicator',
4:28:38 AM: 'Elastic Deep Learning (EDL)',
4:28:38 AM: 'Kashgari',
4:28:38 AM: 'X-DeepLearning',
4:28:38 AM: 'LIME',
4:28:38 AM: 'Model Asset eXchange (MAX)',
4:28:38 AM: 'TransmogrifAI',
4:28:38 AM: 'OpenBytes',
4:28:38 AM: 'DeepLIFT',
4:28:38 AM: 'Onepanel',
4:28:38 AM: 'DeepSpeech',
4:28:38 AM: 'Lucene',
4:28:38 AM: 'Turi Create',
4:28:38 AM: 'Visual Object Tagging Tool (VoTT)',
4:28:38 AM: 'Acumos',
4:28:38 AM: 'Skater',
4:28:38 AM: 'Catalyst',
4:28:38 AM: 'SKIP Language',
4:28:38 AM: 'SQLFlow',
4:28:38 AM: 'Advertorch',
4:28:38 AM: 'xLearn',
4:28:38 AM: 'Neuropod',
4:28:38 AM: 'AdvBox',
4:28:38 AM: 'RCloud',
4:28:38 AM: 'Neo-AI',
4:28:38 AM: 'Embedded Learning Library',
4:28:38 AM: 'Stable Baselines',
4:28:38 AM: 'talos',
4:28:38 AM: 'LabelImg',
4:28:38 AM: 'MMdnn',
4:28:38 AM: 'CNTK',
4:28:38 AM: 'Machine Learning eXchange',
4:28:38 AM: 'Singularity',
4:28:38 AM: 'Chainer',
4:28:38 AM: 'PyText',
4:28:38 AM: 'Pipeline.ai',
4:28:38 AM: 'Apache Bahir',
4:28:38 AM: 'NLP Architect',
4:28:38 AM: 'AllenNLP',
4:28:38 AM: 'Angel-ML',
4:28:38 AM: 'SEED RL',
4:28:38 AM: 'Coach',
4:28:38 AM: 'Gluon-NLP',
4:28:38 AM: 'DeepMind Lab',
4:28:38 AM: 'SEAL',
4:28:38 AM: 'MXNet',
4:28:38 AM: 'OpenAI Gym',
4:28:38 AM: 'MindMeld',
4:28:38 AM: 'CleverHans',
4:28:38 AM: 'Petastorm',
4:28:38 AM: 'Hawq',
4:28:38 AM: 'TF Encrypted',
4:28:38 AM: 'faust',
4:28:38 AM: 'Cortex',
4:28:38 AM: 'OpenDataology',
4:28:38 AM: 'YouTokenToMe',
4:28:38 AM: 'ALBERT',
4:28:38 AM: 'Adlik',
4:28:38 AM: '1chipML',
4:28:38 AM: 'Neural Network Distiller',
4:28:38 AM: 'Labelbox',
4:28:38 AM: 'Facets',
4:28:38 AM: 'OpenNN',
4:28:38 AM: 'Pilosa',
4:28:38 AM: 'Orchest',
4:28:38 AM: 'Model Server for Apache MXNet',
4:28:38 AM: 'LASER',
4:28:38 AM: 'Dopamine',
4:28:38 AM: 'MindSpore',
4:28:38 AM: 'HE Lib',
4:28:38 AM: 'd6tflow',
4:28:38 AM: 'Sonnet',
4:28:38 AM: 'Plaid ML',
4:28:38 AM: 'Nyoka',
4:28:38 AM: 'doccano',
4:28:38 AM: 'ecco',
4:28:38 AM: ... 252 more items
4:28:38 AM: ]
4:28:38 AM: ncc: Version 0.34.0
4:28:38 AM: ncc: Compiling file index.js into CJS
4:28:38 AM: ncc: Version 0.34.0
4:28:38 AM: ncc: Compiling file index.js into CJS
4:28:38 AM: ncc: Version 0.34.0
4:28:38 AM: ncc: Compiling file index.js into CJS
4:28:38 AM: Development server running at http://127.0.0.1:4000/
4:28:38 AM: Running integration-test,check-landscape,render-landscape,funding in parallel
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=accord-net-framework&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=microsoft&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:28:38 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
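The `/api/ids` requests above always carry the full filter set (empty when unused) plus `sort`, `grouping`, and `format`. A hypothetical builder reproducing that query shape (`idsUrl` is an assumed name, not part of the codebase):

```javascript
// Sketch of the /api/ids query string seen in the log: every filter key is
// always emitted, defaults are empty strings, and sort/grouping/format
// carry the view settings.
function idsUrl(overrides = {}) {
  const params = {
    category: '', project: '', license: '', organization: '',
    headquarters: '', 'company-type': '', industries: '',
    sort: 'name', grouping: 'no', bestpractices: '', enduser: '',
    parent: '', language: '', specification: '', format: 'main',
    ...overrides,
  };
  const qs = Object.entries(params)
    .map(([k, v]) => `${k}=${encodeURIComponent(v)}`)
    .join('&');
  return `/api/ids?${qs}`;
}

// e.g. the organization-filtered card view requested above:
console.log(idsUrl({ organization: 'microsoft', grouping: 'project', format: 'card' }));
```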
4:28:38 AM: Task: integration-test PASS specs/main.spec.js (9.624s)
4:28:38 AM: Main test
4:28:38 AM: I visit a main page and have all required elements
4:28:38 AM: ✓ I can open a page (1689ms)
4:28:38 AM: ✓ A proper header is present (6ms)
4:28:38 AM: ✓ Group headers are ok (3ms)
4:28:38 AM: ✓ I see a You are viewing text (2ms)
4:28:38 AM: ✓ A proper card is present (4ms)
4:28:38 AM: ✓ If I click on a card, I see a modal dialog (334ms)
4:28:38 AM: ✓ Closing a browser (25ms)
4:28:38 AM: Landscape Test
4:28:38 AM: ✓ I visit LF AI & Data Members landscape page and have all required elements, elements are clickable (908ms)
4:28:38 AM: ✓ Closing a browser (28ms)
4:28:38 AM: ✓ I visit Companies Hosting Projects landscape page and have all required elements, elements are clickable (878ms)
4:28:38 AM: ✓ Closing a browser (22ms)
4:28:38 AM: I visit a main landscape page and have all required elements
4:28:38 AM: ✓ I open a landscape page and wait for it to load (1848ms)
4:28:38 AM: ✓ When I click on an item the modal is open (90ms)
4:28:38 AM: ✓ If I would straight open the url with a selected id, a modal appears (1865ms)
4:28:38 AM: ✓ Closing a browser (31ms)
4:28:38 AM: Filtering by organization
4:28:38 AM: ✓ Checking we see Accord.NET when filtering by organization Accord.NET Framework (577ms)
4:28:38 AM: ✓ Checking we don't see Accord.NET when filtering by organization Microsoft (239ms)
4:28:38 AM: ✓ Closing a browser (23ms)
4:28:38 AM: PASS specs/tools/actualTwitter.spec.js
4:28:38 AM: Twitter URL
4:28:38 AM: when crunchbase data not set
4:28:38 AM: ✓ returns URL from node (2ms)
4:28:38 AM: when node does not have twitter URL
4:28:38 AM: ✓ returns URL from node (1ms)
4:28:38 AM: when node has twitter URL set to null
4:28:38 AM: ✓ returns undefined
4:28:38 AM: when both node and crunchbase have twitter URL
4:28:38 AM: ✓ returns URL from node (1ms)
4:28:38 AM: when twitter URL is not set anywhere
4:28:38 AM: ✓ returns undefined
4:28:38 AM: cleaning up twitter URL
4:28:38 AM: ✓ replaces http with https (1ms)
4:28:38 AM: ✓ removes www
4:28:38 AM: ✓ query string
4:28:38 AM: Test Suites: 2 passed, 2 total
4:28:38 AM: Tests: 26 passed, 26 total
4:28:38 AM: Snapshots: 0 total
4:28:38 AM: Time: 9.889s
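The `actualTwitter.spec.js` cases above pin down the cleanup rules: node URL wins over crunchbase, an explicit `null` yields `undefined`, `http` becomes `https`, `www.` is dropped, and the query string is stripped. A minimal sketch consistent with those specs (hypothetical function names; the real `tools/actualTwitter.js` may differ):

```javascript
// Cleanup rules exercised by specs/tools/actualTwitter.spec.js.
function cleanTwitterUrl(url) {
  return url
    .replace(/^http:\/\//, 'https://') // replaces http with https
    .replace('://www.', '://')         // removes www
    .split('?')[0];                    // drops the query string
}

// Prefer the node's own twitter field (even when null) over crunchbase data.
function actualTwitter(node = {}, crunchbase = {}) {
  const url = 'twitter' in node ? node.twitter : crunchbase.twitter;
  return url ? cleanTwitterUrl(url) : undefined;
}

console.log(cleanTwitterUrl('http://www.twitter.com/LangChainAI?lang=en'));
// → https://twitter.com/LangChainAI
```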
4:28:38 AM: Task: check-landscape
4:28:38 AM: Task: render-landscape visiting http://localhost:4000/fullscreen?version=2025-02-13T04:27:33Z 60a7223&scale=false&pdf
4:28:38 AM: visiting http://localhost:4000/fullscreen?version=2025-02-13T04:27:33Z 60a7223&scale=false&pdf
4:28:38 AM: Task: funding From https://github.com/lfai/lfai-landscape
4:28:38 AM: * [new branch] NSouthernLF-patch-1 -> github/NSouthernLF-patch-1
4:28:38 AM: * [new branch] NSouthernLF-patch-2 -> github/NSouthernLF-patch-2
4:28:38 AM: * [new branch] NSouthernLF-patch-3 -> github/NSouthernLF-patch-3
4:28:38 AM: * [new branch] NSouthernLF-patch-3-1 -> github/NSouthernLF-patch-3-1
4:28:38 AM: * [new branch] create-pull-request/patch-1693628094 -> github/create-pull-request/patch-1693628094
4:28:38 AM: * [new branch] create-pull-request/patch-1693714471 -> github/create-pull-request/patch-1693714471
4:28:38 AM: * [new branch] create-pull-request/patch-1703668428 -> github/create-pull-request/patch-1703668428
4:28:38 AM: * [new branch] create-pull-request/patch-1739420663 -> github/create-pull-request/patch-1739420663
4:28:38 AM: * [new branch] main -> github/main
4:28:38 AM: * [new branch] revert-303-main -> github/revert-303-main
4:28:38 AM: * [new branch] web-landscape-2021-10-01T22-18 -> github/web-landscape-2021-10-01T22-18
4:28:38 AM: * [new branch] web-landscape-2021-10-12T16-06 -> github/web-landscape-2021-10-12T16-06
4:28:38 AM: * [new branch] web-landscape-2021-10-18T19-46 -> github/web-landscape-2021-10-18T19-46
4:28:38 AM: * [new branch] web-landscape-2021-10-20T18-58 -> github/web-landscape-2021-10-20T18-58
4:28:42 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:28:43 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:28:43 AM: * Documentation: https://help.ubuntu.com
4:28:43 AM: * Management: https://landscape.canonical.com
4:28:43 AM: * Support: https://ubuntu.com/advantage
4:28:43 AM: System information as of Thu Feb 13 04:28:42 UTC 2025
4:28:43 AM: System load: 2.05
4:28:43 AM: Usage of /: 76.3% of 217.51GB
4:28:43 AM: Memory usage: 18%
4:28:43 AM: Swap usage: 1%
4:28:43 AM: Processes: 655
4:28:43 AM: Users logged in: 1
4:28:43 AM: IPv4 address for bond0: 147.75.199.15
4:28:43 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:28:43 AM: IPv4 address for docker0: 172.17.0.1
4:28:43 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:28:43 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:28:43 AM: 82 updates can be applied immediately.
4:28:43 AM: 7 of these updates are standard security updates.
4:28:43 AM: To see these additional updates run: apt list --upgradable
4:28:43 AM: New release '22.04.5 LTS' available.
4:28:43 AM: Run 'do-release-upgrade' to upgrade to it.
4:28:43 AM: 2 updates could not be installed automatically. For more details,
4:28:43 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:28:43 AM: *** System restart required ***
4:28:43 AM: Remote build done!
4:28:43 AM: Pseudo-terminal will not be allocated because stdin is not a terminal.
4:28:43 AM: Welcome to Ubuntu 20.04.5 LTS (GNU/Linux 5.4.0-153-generic x86_64)
4:28:43 AM: * Documentation: https://help.ubuntu.com
4:28:43 AM: * Management: https://landscape.canonical.com
4:28:43 AM: * Support: https://ubuntu.com/advantage
4:28:43 AM: System information as of Thu Feb 13 04:26:03 UTC 2025
4:28:43 AM: System load: 1.69
4:28:43 AM: Usage of /: 75.8% of 217.51GB
4:28:43 AM: Memory usage: 20%
4:28:43 AM: Swap usage: 1%
4:28:43 AM: Processes: 641
4:28:43 AM: Users logged in: 1
4:28:43 AM: IPv4 address for bond0: 147.75.199.15
4:28:43 AM: IPv6 address for bond0: 2604:1380:45d2:700::3
4:28:43 AM: IPv4 address for docker0: 172.17.0.1
4:28:43 AM: * Strictly confined Kubernetes makes edge and IoT secure. Learn how MicroK8s
4:28:43 AM: just raised the bar for easy, resilient and secure K8s cluster deployment.
4:28:43 AM: 82 updates can be applied immediately.
4:28:43 AM: 7 of these updates are standard security updates.
4:28:43 AM: To see these additional updates run: apt list --upgradable
4:28:43 AM: New release '22.04.5 LTS' available.
4:28:43 AM: Run 'do-release-upgrade' to upgrade to it.
4:28:43 AM: 2 updates could not be installed automatically. For more details,
4:28:43 AM: see /var/log/unattended-upgrades/unattended-upgrades.log
4:28:43 AM: *** System restart required ***
4:28:43 AM: /opt/buildhome/.nvm/nvm.sh
4:28:43 AM: .:
4:28:43 AM: ADOPTERS.md jest.config.js netlify.toml tools
4:28:43 AM: bin landscapes_dev package.json update_server
4:28:43 AM: build.sh landscapes.sh postcss.config.js _yarn.lock
4:28:43 AM: code-of-conduct.md landscapes.yml README.md yarn.lock
4:28:43 AM: files LICENSE server.js
4:28:43 AM: _headers netlify specs
4:28:43 AM: INSTALL.md netlify.md src
4:28:43 AM: v18.3
4:28:43 AM: Downloading and installing node v18.3.0...
4:28:43 AM: Computing checksum with sha256sum
4:28:43 AM: Checksums matched!
4:28:43 AM: Now using node v18.3.0 (npm v8.11.0)
4:28:43 AM: Now using node v18.3.0 (npm v8.11.0)
4:28:43 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:28:43 AM:
4:28:43 AM: added 3 packages, and audited 4 packages in 510ms
4:28:43 AM: found 0 vulnerabilities
4:28:43 AM: npm WARN config global `--global`, `--local` are deprecated. Use `--location=global` instead.
4:28:43 AM:
4:28:43 AM: removed 12 packages, changed 105 packages, and audited 263 packages in 2s
4:28:43 AM: 27 packages are looking for funding
4:28:43 AM: run `npm fund` for details
4:28:43 AM: found 0 vulnerabilities
4:28:43 AM: added 1 package in 1s
4:28:43 AM: YN0000: ┌ Resolution step
4:28:43 AM: YN0000: └ Completed
4:28:43 AM: YN0000: ┌ Fetch step
4:28:43 AM: YN0013: │ 2 packages were already cached, 808 had to be fetched
4:28:43 AM: YN0000: └ Completed in 6s 214ms
4:28:43 AM: YN0000: ┌ Link step
4:28:43 AM: YN0000: │ ESM support for PnP uses the experimental loader API and is therefore experimental
4:28:43 AM: YN0007: │ puppeteer@npm:14.2.1 must be built because it never has been before or the last one failed
4:28:43 AM: YN0007: │ puppeteer@npm:13.2.0 must be built because it never has been before or the last one failed
4:28:43 AM: YN0007: │ yarn@npm:1.22.18 must be built because it never has been before or the last one failed
4:28:43 AM: YN0000: │ puppeteer@npm:13.2.0 STDERR
4:28:43 AM: YN0000: │ puppeteer@npm:14.2.1 STDERR
4:28:43 AM: YN0000: │ puppeteer@npm:13.2.0 STDOUT Chromium (961656) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-13.2.0-5595d43df8/node_modules/puppeteer/.local-chromium/linux-961656
4:28:43 AM: YN0000: │ puppeteer@npm:14.2.1 STDOUT Chromium (1002410) downloaded to /opt/repo/packageRemote/.yarn/unplugged/puppeteer-npm-14.2.1-e2757bbf6c/node_modules/puppeteer/.local-chromium/linux-1002410
4:28:43 AM: YN0000: └ Completed in 6s 818ms
4:28:43 AM: YN0000: Done with warnings in 13s 452ms
4:28:43 AM: Processing the tree
4:28:43 AM: Running with a level=easy. Settings:
4:28:43 AM: Use cached crunchbase data: true
4:28:43 AM: Use cached images data: true
4:28:43 AM: Use cached twitter data: true
4:28:43 AM: Use cached github basic stats: true
4:28:43 AM: Use cached github start dates: true
4:28:43 AM: Use cached best practices: true
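The `level=easy` run above reuses every cached data source. As a sketch, the level could map to a settings object like the one below; the mapping and names are inferred from the flags printed in the log, not taken from the landscapeapp source.

```javascript
// Inferred cache settings per crawl level; the 'easy' entry matches the
// "Use cached ..." flags printed in the log above. The 'full' level is a
// hypothetical counterpart that would refetch everything.
const levels = {
  easy: {
    useCachedCrunchbase: true,
    useCachedImages: true,
    useCachedTwitter: true,
    useCachedGithubBasicStats: true,
    useCachedGithubStartDates: true,
    useCachedBestPractices: true,
  },
  full: {
    useCachedCrunchbase: false,
    useCachedImages: false,
    useCachedTwitter: false,
    useCachedGithubBasicStats: false,
    useCachedGithubStartDates: false,
    useCachedBestPractices: false,
  },
};

function cacheSettings(level = 'easy') {
  return levels[level] || levels.easy;
}
```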
4:28:43 AM: Fetching crunchbase entries
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ....................................................**
4:28:43 AM: Fetching github entries
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ..................................*********************.........................
4:28:43 AM: ................................................................................
4:28:43 AM: ...................................................................*********
4:28:43 AM: Fetching start date entries
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ............................................***********.........................
4:28:43 AM: ................................................................................
4:28:43 AM: .........................................................*******************
4:28:43 AM: Fetching images
4:28:43 AM: got image entries
4:28:43 AM: Hash for Prefect is prefect-2
4:28:43 AM: ................................................................................
4:28:43 AM: ....**......**..................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: Fetching last tweet dates
4:28:43 AM: Fetching best practices
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ................................................................................
4:28:43 AM: ...............................................
4:28:43 AM: Fetching CLOMonitor data
4:28:43 AM: Processing the tree
4:28:43 AM: saving!
4:28:43 AM: Hash for Prefect is prefect-2
4:28:43 AM: Warning: Open Platform for Enterprise AI (OPEA) has a twitter https://twitter.com/opeadev which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Dragonfly has a twitter https://twitter.com/dragonfly_oss which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Vald has a twitter https://twitter.com/vdaas_vald which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: FastTrackML has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Mathesar has a twitter https://twitter.com/centerofci which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: RWKV has a twitter https://twitter.com/RWKV_AI which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Langchain has a twitter https://twitter.com/LangChainAI which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Haystack has a twitter https://twitter.com/deepset_ai which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Armada has a twitter https://twitter.com/oss_gr which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: envd has a twitter https://twitter.com/TensorChord which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: NVIDIA Corporation has a twitter https://twitter.com/nvidia which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Advanced Micro Devices (AMD) has a twitter https://twitter.com/AMD which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Alibaba Cloud (Singapore) Private LTD has a twitter https://twitter.com/AlibabaB2B which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Fujitsu Limited has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: GitCode has a twitter https://twitter.com/turingbook which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Neo4j, Inc. has a twitter https://twitter.com/neo4j which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Posit has a twitter https://twitter.com/posit_pbc which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Snyk Limited has a twitter https://twitter.com/snyksec which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: AI for People has a twitter https://twitter.com/AIforPeople which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Ambianic.ai has a twitter https://twitter.com/ambianicai which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Columbia University has a twitter https://twitter.com/Columbia which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Common Crawl Foundation has a twitter https://twitter.com/commoncrawl which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Ersilia Open Source Initiative has a twitter https://twitter.com/ersiliaio which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Ewha Institute for Biomedical Law & Ethics has a twitter https://twitter.com/cybercampus which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Montreal AI Ethics Institute has a twitter https://twitter.com/mtlaiethics which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: NIPA has a twitter https://twitter.com/NIPAkr which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: One Fact Foundation has a twitter https://twitter.com/onefact_org which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Pranveer Singh Institute Of Technology has a twitter https://twitter.com/PSITKanpur2004 which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Rensselaer Polytechnic Institute has a twitter https://twitter.com/rpi which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Transport for NSW has a twitter https://twitter.com/TransportforNSW which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: University of Michigan has a twitter https://twitter.com/UMich which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Wikimedia Deutschland e. V. has a twitter https://twitter.com/WikimediaDE which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: World Ethical Data Foundation has a twitter https://twitter.com/WEDF_foundation which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: BasicAI(hosting) has a twitter https://twitter.com/BasicAIteam which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Fujitsu Limited(hosting) has a twitter https://twitter.com/Fujitsu_Global which is invalid or we just can not fetch its tweets
4:28:43 AM: Warning: Owkin(hosting) has a twitter https://twitter.com/owkinscience which is invalid or we just can not fetch its tweets
4:28:43 AM: Fetching members from LF AI & Data Member Company category
4:28:43 AM: Processing the tree
4:28:43 AM: Assigning Premier membership on DataHub (LinkedIn) because its parent Microsoft has Premier membership
4:28:43 AM: Assigning Premier membership on Brooklin (LinkedIn) because its parent Microsoft has Premier membership
4:28:43 AM: Assigning Premier membership on Azkaban (LinkedIn) because its parent Microsoft has Premier membership
4:28:43 AM: Assigning Premier membership on LinkedIn (hosting) (LinkedIn) because its parent Microsoft has Premier membership
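The membership inheritance logged above (a subsidiary project picks up its parent company's membership tier) can be sketched as follows. The `entries` shape and field names are assumptions for illustration, not the landscapeapp internals.

```javascript
// Hypothetical sketch of parent-membership inheritance: each entry may name
// a `parent` and may already carry a `membership` tier; children without a
// tier inherit their parent's.
function inheritMembership(entries) {
  const byName = new Map(entries.map((e) => [e.name, e]));
  for (const entry of entries) {
    const parent = entry.parent && byName.get(entry.parent);
    if (parent && parent.membership && !entry.membership) {
      entry.membership = parent.membership; // e.g. 'Premier'
      console.log(
        `Assigning ${parent.membership} membership on ${entry.name} ` +
        `because its parent ${parent.name} has ${parent.membership} membership`
      );
    }
  }
  return entries;
}
```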
4:28:43 AM: Hash for Chaitanya Bharathi Institute of Technology is chaitanya-bharathi-institute-of-technology-2
4:28:43 AM: Hash for Fast.ai is fast-ai-2
4:28:43 AM: Hash for Great Expectations is great-expectations-2
4:28:43 AM: Hash for ML Perf is ml-perf-2
4:28:43 AM: Hash for PipelineAI is pipeline-ai-2
4:28:43 AM: Hash for Prefect is prefect-2
4:28:43 AM: Hash for Redash is redash-2
4:28:43 AM: Hash for Wikimedia Deutschland e.V. is wikimedia-deutschland-e-v-2
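The `Hash for X is x-2` lines suggest that name slugs receive a numeric suffix when two entries collide. A minimal sketch of that behavior follows; the exact slug rules are an assumption based on the logged output.

```javascript
// Hypothetical slug generator: lowercase the name, collapse non-alphanumeric
// runs to dashes, trim edge dashes, and append -2, -3, ... on collisions.
function makeSlugger() {
  const seen = new Map();
  return function slug(name) {
    const base = name
      .toLowerCase()
      .replace(/[^a-z0-9]+/g, '-')
      .replace(/^-+|-+$/g, '');
    const count = (seen.get(base) || 0) + 1;
    seen.set(base, count);
    return count === 1 ? base : `${base}-${count}`;
  };
}
```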
4:28:43 AM: {
4:28:43 AM: name: 'Accord.NET',
4:28:43 AM: homepage_url: 'http://accord-framework.net/',
4:28:43 AM: logo: 'accord-net.svg',
4:28:43 AM: github_data: {
4:28:43 AM: languages: [
4:28:43 AM: [Object], [Object],
4:28:43 AM: [Object], [Object],
4:28:43 AM: [Object], [Object],
4:28:43 AM: [Object], [Object],
4:28:43 AM: [Object], [Object],
4:28:43 AM: [Object], [Object],
4:28:43 AM: [Object], [Object],
4:28:43 AM: [Object]
4:28:43 AM: ],
4:28:43 AM: contributions: '0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0',
4:28:43 AM: firstWeek: '2022-11-27Z',
4:28:43 AM: stars: 4404,
4:28:43 AM: license: 'GNU Lesser General Public License v2.1',
4:28:43 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:28:43 AM: latest_commit_date: '2020-11-18T19:53:01Z',
4:28:43 AM: latest_commit_link: '/accord-net/framework/commit/1ab0cc0ba55bcc3d46f20e7bbe7224b58cd01854',
4:28:43 AM: release_date: '2017-10-19T21:00:56Z',
4:28:43 AM: contributors_count: 98
4:28:43 AM: },
4:28:43 AM: repos: [ { url: 'https://github.com/accord-net/framework', stars: 4404 } ],
4:28:43 AM: github_start_commit_data: {
4:28:43 AM: start_commit_link: '/accord-net/framework/commit/5e834efc89c071bb32c84f41124d5c9a5bbfe1e3',
4:28:43 AM: start_date: '2012-04-08T14:05:58Z'
4:28:43 AM: },
4:28:43 AM: image_data: {
4:28:43 AM: fileName: 'accord-net.svg',
4:28:43 AM: hash: 'tdvFWfo6hfZQqHqjscAtcoZWk6qztZFzBTVqKWUjYig='
4:28:43 AM: },
4:28:43 AM: firstCommitDate: '2012-04-08T14:05:58Z',
4:28:43 AM: latestCommitDate: '2020-11-18T19:53:01Z',
4:28:43 AM: releaseDate: '2017-10-19T21:00:56Z',
4:28:43 AM: commitsThisYear: 0,
4:28:43 AM: contributorsCount: 98,
4:28:43 AM: language: 'C#',
4:28:43 AM: stars: 4404,
4:28:43 AM: license: 'GNU Lesser General Public License v2.1',
4:28:43 AM: headquarters: 'Grenoble, France',
4:28:43 AM: description: 'Machine learning, computer vision, statistics and general scientific computing for .NET',
4:28:43 AM: organization: 'Accord.NET Framework',
4:28:43 AM: crunchbaseData: {
4:28:43 AM: name: 'Accord.NET Framework',
4:28:43 AM: description: 'Machine Learning Framework',
4:28:43 AM: homepage: 'http://accord-framework.net/',
4:28:43 AM: city: 'Grenoble',
4:28:43 AM: region: 'Rhone-Alpes',
4:28:43 AM: country: 'France',
4:28:43 AM: twitter: null,
4:28:43 AM: linkedin: null,
4:28:43 AM: acquisitions: [],
4:28:43 AM: parents: [],
4:28:43 AM: stockExchange: null,
4:28:43 AM: company_type: 'Non Profit',
4:28:43 AM: industries: [
4:28:43 AM: 'Analytics',
4:28:43 AM: 'Artificial Intelligence',
4:28:43 AM: 'Hardware',
4:28:43 AM: 'Machine Learning'
4:28:43 AM: ],
4:28:43 AM: numEmployeesMin: null,
4:28:43 AM: numEmployeesMax: null
4:28:43 AM: },
4:28:43 AM: path: 'Machine Learning / Framework',
4:28:43 AM: landscape: 'Machine Learning / Framework',
4:28:43 AM: category: 'Machine Learning',
4:28:43 AM: amount: 'N/A',
4:28:43 AM: oss: true,
4:28:43 AM: href: 'logos/accord-net.svg',
4:28:43 AM: bestPracticeBadgeId: false,
4:28:43 AM: bestPracticePercentage: null,
4:28:43 AM: industries: [
4:28:43 AM: 'Analytics',
4:28:43 AM: 'Artificial Intelligence',
4:28:43 AM: 'Hardware',
4:28:43 AM: 'Machine Learning'
4:28:43 AM: ],
4:28:43 AM: starsPresent: true,
4:28:43 AM: starsAsText: '4,404',
4:28:43 AM: marketCapPresent: false,
4:28:43 AM: marketCapAsText: 'N/A',
4:28:43 AM: id: 'accord-net',
4:28:43 AM: flatName: 'Accord.NET',
4:28:43 AM: member: false,
4:28:43 AM: relation: false,
4:28:43 AM: isSubsidiaryProject: false
4:28:43 AM: } 2020-11-18T19:53:01Z
4:28:43 AM: [
4:28:43 AM: 'Community Data License Agreement (CDLA)',
4:28:43 AM: 'PlaNet',
4:28:43 AM: 'Generic Neural Elastic Search (GNES)',
4:28:43 AM: 'PredictionIO',
4:28:43 AM: 'ELI5',
4:28:43 AM: 'BERT',
4:28:43 AM: 'Nauta',
4:28:43 AM: 'DAWNBench',
4:28:43 AM: 'AresDB',
4:28:43 AM: 'dotmesh',
4:28:43 AM: 'Audit AI',
4:28:43 AM: 'euler',
4:28:43 AM: 'Clipper',
4:28:43 AM: 'Accord.NET',
4:28:43 AM: 'Shogun',
4:28:43 AM: 'DELTA',
4:28:43 AM: 'BeakerX',
4:28:43 AM: 'PixieDust',
4:28:43 AM: 'TreeInterpreter',
4:28:43 AM: 'Cyclone',
4:28:43 AM: 'Lucid',
4:28:43 AM: 'XLM',
4:28:43 AM: 'Chainer RL',
4:28:43 AM: 'ForestFlow',
4:28:43 AM: 'uReplicator',
4:28:43 AM: 'Elastic Deep Learning (EDL)',
4:28:43 AM: 'Kashgari',
4:28:43 AM: 'X-DeepLearning',
4:28:43 AM: 'LIME',
4:28:43 AM: 'Model Asset eXchange (MAX)',
4:28:43 AM: 'TransmogrifAI',
4:28:43 AM: 'OpenBytes',
4:28:43 AM: 'DeepLIFT',
4:28:43 AM: 'Onepanel',
4:28:43 AM: 'DeepSpeech',
4:28:43 AM: 'Lucene',
4:28:43 AM: 'Turi Create',
4:28:43 AM: 'Visual Object Tagging Tool (VoTT)',
4:28:43 AM: 'Acumos',
4:28:43 AM: 'Skater',
4:28:43 AM: 'Catalyst',
4:28:43 AM: 'SKIP Language',
4:28:43 AM: 'SQLFlow',
4:28:43 AM: 'Advertorch',
4:28:43 AM: 'xLearn',
4:28:43 AM: 'Neuropod',
4:28:43 AM: 'AdvBox',
4:28:43 AM: 'RCloud',
4:28:43 AM: 'Neo-AI',
4:28:43 AM: 'Embedded Learning Library',
4:28:43 AM: 'Stable Baselines',
4:28:43 AM: 'talos',
4:28:43 AM: 'LabelImg',
4:28:43 AM: 'MMdnn',
4:28:43 AM: 'CNTK',
4:28:43 AM: 'Machine Learning eXchange',
4:28:43 AM: 'Singularity',
4:28:43 AM: 'Chainer',
4:28:43 AM: 'PyText',
4:28:43 AM: 'Pipeline.ai',
4:28:43 AM: 'Apache Bahir',
4:28:43 AM: 'NLP Architect',
4:28:43 AM: 'AllenNLP',
4:28:43 AM: 'Angel-ML',
4:28:43 AM: 'SEED RL',
4:28:43 AM: 'Coach',
4:28:43 AM: 'Gluon-NLP',
4:28:43 AM: 'DeepMind Lab',
4:28:43 AM: 'SEAL',
4:28:43 AM: 'MXNet',
4:28:43 AM: 'OpenAI Gym',
4:28:43 AM: 'MindMeld',
4:28:43 AM: 'CleverHans',
4:28:43 AM: 'Petastorm',
4:28:43 AM: 'Hawq',
4:28:43 AM: 'TF Encrypted',
4:28:43 AM: 'faust',
4:28:43 AM: 'Cortex',
4:28:43 AM: 'OpenDataology',
4:28:43 AM: 'YouTokenToMe',
4:28:43 AM: 'ALBERT',
4:28:43 AM: 'Adlik',
4:28:43 AM: '1chipML',
4:28:43 AM: 'Neural Network Distiller',
4:28:43 AM: 'Labelbox',
4:28:43 AM: 'Facets',
4:28:43 AM: 'OpenNN',
4:28:43 AM: 'Pilosa',
4:28:43 AM: 'Orchest',
4:28:43 AM: 'Model Server for Apache MXNet',
4:28:43 AM: 'LASER',
4:28:43 AM: 'Dopamine',
4:28:43 AM: 'MindSpore',
4:28:43 AM: 'HE Lib',
4:28:43 AM: 'd6tflow',
4:28:43 AM: 'Sonnet',
4:28:43 AM: 'Plaid ML',
4:28:43 AM: 'Nyoka',
4:28:43 AM: 'doccano',
4:28:43 AM: 'ecco',
4:28:43 AM: ... 252 more items
4:28:43 AM: ]
4:28:43 AM: ncc: Version 0.34.0
4:28:43 AM: ncc: Compiling file index.js into CJS
4:28:43 AM: ncc: Version 0.34.0
4:28:43 AM: ncc: Compiling file index.js into CJS
4:28:43 AM: ncc: Version 0.34.0
4:28:43 AM: ncc: Compiling file index.js into CJS
4:28:43 AM: Development server running at http://127.0.0.1:4000/
4:28:43 AM: Running integration-test,check-landscape,render-landscape,funding in parallel
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=main
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=accord-net-framework&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=microsoft&headquarters=&company-type=&industries=&sort=name&grouping=project&bestpractices=&enduser=&parent=&language=&specification=&format=card
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=members
4:28:43 AM: api request starting... /api/ids?category=&project=&license=&organization=&headquarters=&company-type=&industries=&sort=name&grouping=no&bestpractices=&enduser=&parent=&language=&specification=&format=hosting
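Each of the requests above hits the same `/api/ids` function with a different `grouping`/`format`/`organization` combination. A hedged sketch of building such a URL is below; the parameter list is copied from the logged requests, but the helper itself is illustrative, not part of landscapeapp.

```javascript
// Build an /api/ids query string shaped like the requests in the log above.
// Every parameter defaults to empty (with sort=name, grouping=no,
// format=main), matching the logged default request.
function idsUrl(overrides = {}) {
  const params = {
    category: '', project: '', license: '', organization: '',
    headquarters: '', 'company-type': '', industries: '', sort: 'name',
    grouping: 'no', bestpractices: '', enduser: '', parent: '',
    language: '', specification: '', format: 'main', ...overrides,
  };
  const qs = Object.entries(params)
    .map(([k, v]) => `${k}=${encodeURIComponent(v)}`)
    .join('&');
  return `/api/ids?${qs}`;
}
```

For example, `idsUrl({ organization: 'microsoft', grouping: 'project', format: 'card' })` reproduces the filter-by-Microsoft request seen in the log.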
4:28:43 AM: Task: integration-test PASS specs/main.spec.js (9.624s)
4:28:43 AM: Main test
4:28:43 AM: I visit a main page and have all required elements
4:28:43 AM: ✓ I can open a page (1689ms)
4:28:43 AM: ✓ A proper header is present (6ms)
4:28:43 AM: ✓ Group headers are ok (3ms)
4:28:43 AM: ✓ I see a You are viewing text (2ms)
4:28:43 AM: ✓ A proper card is present (4ms)
4:28:43 AM: ✓ If I click on a card, I see a modal dialog (334ms)
4:28:43 AM: ✓ Closing a browser (25ms)
4:28:43 AM: Landscape Test
4:28:43 AM: ✓ I visit LF AI & Data Members landscape page and have all required elements, elements are clickable (908ms)
4:28:43 AM: ✓ Closing a browser (28ms)
4:28:43 AM: ✓ I visit Companies Hosting Projects landscape page and have all required elements, elements are clickable (878ms)
4:28:43 AM: ✓ Closing a browser (22ms)
4:28:43 AM: I visit a main landscape page and have all required elements
4:28:43 AM: ✓ I open a landscape page and wait for it to load (1848ms)
4:28:43 AM: ✓ When I click on an item the modal is open (90ms)
4:28:43 AM: ✓ If I would straight open the url with a selected id, a modal appears (1865ms)
4:28:43 AM: ✓ Closing a browser (31ms)
4:28:43 AM: Filtering by organization
4:28:43 AM: ✓ Checking we see Accord.NET when filtering by organization Accord.NET Framework (577ms)
4:28:43 AM: ✓ Checking we don't see Accord.NET when filtering by organization Microsoft (239ms)
4:28:43 AM: ✓ Closing a browser (23ms)
4:28:43 AM: PASS specs/tools/actualTwitter.spec.js
4:28:43 AM: Twitter URL
4:28:43 AM: when crunchbase data not set
4:28:43 AM: ✓ returns URL from node (2ms)
4:28:43 AM: when node does not have twitter URL
4:28:43 AM: ✓ returns URL from node (1ms)
4:28:43 AM: when node has twitter URL set to null
4:28:43 AM: ✓ returns undefined
4:28:43 AM: when both node and crunchbase have twitter URL
4:28:43 AM: ✓ returns URL from node (1ms)
4:28:43 AM: when twitter URL is not set anywhere
4:28:43 AM: ✓ returns undefined
4:28:43 AM: cleaning up twitter URL
4:28:43 AM: ✓ replaces http with https (1ms)
4:28:43 AM: ✓ removes www
4:28:43 AM: ✓ query string
4:28:43 AM: Test Suites: 2 passed, 2 total
4:28:43 AM: Tests: 26 passed, 26 total
4:28:43 AM: Snapshots: 0 total
4:28:43 AM: Time: 9.889s
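The behavior that `specs/tools/actualTwitter.spec.js` describes (the node's own Twitter URL wins over crunchbase data, `http` becomes `https`, `www.` is removed, and the query string is stripped) can be sketched like this. Field names follow the test descriptions; the implementation itself is a guess, not the real tool.

```javascript
// Hypothetical reimplementation of the behavior covered by the Twitter URL
// tests: prefer the node's own twitter field, fall back to crunchbase, and
// normalize the result (https, no www, no query string).
function actualTwitter(node, crunchbase) {
  const url = node && node.twitter !== undefined
    ? node.twitter
    : crunchbase && crunchbase.twitter;
  if (!url) return undefined; // covers both null and unset
  return url
    .replace(/^http:\/\//, 'https://')
    .replace(/^https:\/\/www\./, 'https://')
    .split('?')[0];
}
```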
4:28:43 AM: Task: check-landscape
4:28:43 AM: Task: render-landscape visiting http://localhost:4000/fullscreen?version=2025-02-13T04:27:33Z 60a7223&scale=false&pdf
4:28:43 AM: Task: funding From https://github.com/lfai/lfai-landscape
4:28:43 AM: * [new branch] NSouthernLF-patch-1 -> github/NSouthernLF-patch-1
4:28:43 AM: * [new branch] NSouthernLF-patch-2 -> github/NSouthernLF-patch-2
4:28:43 AM: * [new branch] NSouthernLF-patch-3 -> github/NSouthernLF-patch-3
4:28:43 AM: * [new branch] NSouthernLF-patch-3-1 -> github/NSouthernLF-patch-3-1
4:28:43 AM: * [new branch] create-pull-request/patch-1693628094 -> github/create-pull-request/patch-1693628094
4:28:43 AM: * [new branch] create-pull-request/patch-1693714471 -> github/create-pull-request/patch-1693714471
4:28:43 AM: * [new branch] create-pull-request/patch-1703668428 -> github/create-pull-request/patch-1703668428
4:28:43 AM: * [new branch] create-pull-request/patch-1739420663 -> github/create-pull-request/patch-1739420663
4:28:43 AM: * [new branch] main -> github/main
4:28:43 AM: * [new branch] revert-303-main -> github/revert-303-main
4:28:43 AM: * [new branch] web-landscape-2021-10-01T22-18 -> github/web-landscape-2021-10-01T22-18
4:28:43 AM: * [new branch] web-landscape-2021-10-12T16-06 -> github/web-landscape-2021-10-12T16-06
4:28:43 AM: * [new branch] web-landscape-2021-10-18T19-46 -> github/web-landscape-2021-10-18T19-46
4:28:43 AM: * [new branch] web-landscape-2021-10-20T18-58 -> github/web-landscape-2021-10-20T18-58
4:28:44 AM: ​
4:28:44 AM: (build.command completed in 3m 37.2s)
4:28:44 AM:
4:28:44 AM: Functions bundling
4:28:44 AM: ────────────────────────────────────────────────────────────────
4:28:44 AM: ​
4:28:44 AM: Packaging Functions from /opt/build/repo/functions directory:
4:28:44 AM: - export.js
4:28:44 AM: - ids.js
4:28:44 AM: - items.js
4:28:44 AM: ​
4:28:49 AM: ​
4:28:49 AM: (Functions bundling completed in 4.8s)
4:28:49 AM:
4:29:00 AM: (Netlify Build completed in 3m 52.6s)
4:29:01 AM: Section completed: building
4:29:07 AM: Finished processing build request in 4m25.593s

Deploying

Complete
4:28:49 AM: Deploy site
4:28:49 AM: ────────────────────────────────────────────────────────────────
4:28:49 AM: ​
4:28:49 AM: Starting to deploy site from 'dist'
4:28:50 AM: Calculating files to upload
4:28:53 AM: 66 new file(s) to upload
4:28:53 AM: 3 new function(s) to upload
4:29:00 AM: Section completed: deploying
4:29:00 AM: Site deploy was successfully initiated
4:29:00 AM: ​
4:29:00 AM: (Deploy site completed in 10.4s)

Cleanup

Complete
4:29:00 AM: Netlify Build Complete
4:29:00 AM: ────────────────────────────────────────────────────────────────
4:29:00 AM: ​
4:29:01 AM: Caching artifacts
4:29:01 AM: Started saving build plugins
4:29:01 AM: Finished saving build plugins
4:29:01 AM: Started saving mise cache
4:29:01 AM: Finished saving mise cache
4:29:01 AM: Started saving pip cache
4:29:01 AM: Finished saving pip cache
4:29:01 AM: Started saving emacs cask dependencies
4:29:01 AM: Finished saving emacs cask dependencies
4:29:01 AM: Started saving maven dependencies
4:29:01 AM: Finished saving maven dependencies
4:29:01 AM: Started saving boot dependencies
4:29:01 AM: Finished saving boot dependencies
4:29:01 AM: Started saving rust rustup cache
4:29:01 AM: Finished saving rust rustup cache
4:29:01 AM: Started saving go dependencies
4:29:01 AM: Finished saving go dependencies
4:29:01 AM: Build script success
4:29:06 AM: Uploading Cache of size 195.3MB
4:29:07 AM: Section completed: cleanup

Post-processing

Complete
4:29:00 AM: Starting post processing
4:29:00 AM: Post processing - redirect rules
4:29:00 AM: Post processing done
4:29:00 AM: Section completed: postprocessing
4:29:00 AM: Skipping form detection
4:29:00 AM: Post processing - header rules
4:29:01 AM: Site is live ✨