Back to Multiple platform build/check report for BioC 3.18
|
This page was generated on 2023-11-02 11:41:10 -0400 (Thu, 02 Nov 2023).
Hostname | OS | Arch (*) | R version | Installed pkgs
---|---|---|---|---
nebbiolo2 | Linux (Ubuntu 22.04.2 LTS) | x86_64 | 4.3.1 (2023-06-16) -- "Beagle Scouts" | 4729
palomino4 | Windows Server 2022 Datacenter | x64 | 4.3.1 (2023-06-16 ucrt) -- "Beagle Scouts" | 4463
lconway | macOS 12.6.5 Monterey | x86_64 | 4.3.1 Patched (2023-06-17 r84564) -- "Beagle Scouts" | 4478
kunpeng2 | Linux (openEuler 22.03 LTS-SP1) | aarch64 | 4.3.1 (2023-06-16) -- "Beagle Scouts" | 4464
Click on any hostname to see more info about the system (e.g. compilers).
(*) as reported by 'uname -p', except on Windows and Mac OS X.
Package 1732/2266: RGMQL 1.22.0 (landing page) -- Simone Pallotta

Hostname | OS / Arch | INSTALL | BUILD | CHECK | BUILD BIN
---|---|---|---|---|---
nebbiolo2 | Linux (Ubuntu 22.04.2 LTS) / x86_64 | OK | OK | WARNINGS |
palomino4 | Windows Server 2022 Datacenter / x64 | OK | OK | ERROR | OK
lconway | macOS 12.6.5 Monterey / x86_64 | ... NOT SUPPORTED ... | | |
kjohnson1 | macOS 13.6.1 Ventura / arm64 | see weekly results here | | |
kunpeng2 | Linux (openEuler 22.03 LTS-SP1) / aarch64 | OK | ERROR | skipped |
To the developers/maintainers of the RGMQL package:
- Allow up to 24 hours (and sometimes 48 hours) for your latest push to git@git.bioconductor.org:packages/RGMQL.git to be reflected on this report. See Troubleshooting Build Report for more information.
- Use the following Renviron settings to reproduce errors and warnings. See Renviron.bioc for more information.
- If 'R CMD check' recently started to fail on the Linux builder(s) over a missing dependency, add the missing dependency to 'Suggests:' in your DESCRIPTION file (see the example after this list).
- See Martin Grigorov's blog post for how to debug Linux ARM64 related issues on an x86_64 host.
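For illustration only (the package names below are placeholders, not taken from RGMQL's actual DESCRIPTION), a vignette-only dependency is declared under the 'Suggests:' field of the DESCRIPTION file like this:

    Suggests:
        knitr,
        rmarkdown,
        BiocStyle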
Package: RGMQL
Version: 1.22.0
Command: /home/biocbuild/R/R-4.3.1/bin/R CMD build --keep-empty-dirs --no-resave-data RGMQL
StartedAt: 2023-11-02 05:10:09 -0000 (Thu, 02 Nov 2023)
EndedAt: 2023-11-02 05:11:14 -0000 (Thu, 02 Nov 2023)
EllapsedTime: 65.1 seconds
RetCode: 1
Status: ERROR
PackageFile: None
PackageFileSize: NA
##############################################################################
##############################################################################
###
### Running command:
###
###   /home/biocbuild/R/R-4.3.1/bin/R CMD build --keep-empty-dirs --no-resave-data RGMQL
###
##############################################################################
##############################################################################

* checking for file ‘RGMQL/DESCRIPTION’ ... OK
* preparing ‘RGMQL’:
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... ERROR
--- re-building ‘RGMQL-vignette.Rmd’ using rmarkdown
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
23/11/02 05:11:13 INFO SparkContext: Running Spark version 2.2.0
23/11/02 05:11:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
23/11/02 05:11:13 INFO SparkContext: Submitted application: GMQL-R
23/11/02 05:11:13 INFO SecurityManager: Changing view acls to: biocbuild
23/11/02 05:11:13 INFO SecurityManager: Changing modify acls to: biocbuild
23/11/02 05:11:13 INFO SecurityManager: Changing view acls groups to:
23/11/02 05:11:13 INFO SecurityManager: Changing modify acls groups to:
23/11/02 05:11:13 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(biocbuild); groups with view permissions: Set(); users with modify permissions: Set(biocbuild); groups with modify permissions: Set()
23/11/02 05:11:13 INFO PlatformDependent: Your platform does not provide complete low-level API for accessing direct buffers reliably. Unless explicitly requested, heap buffer will always be preferred to avoid potential system unstability.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/11/02 05:11:14 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
	at java.base/sun.nio.ch.Net.bind0(Native Method)
	at java.base/sun.nio.ch.Net.bind(Net.java:555)
	at java.base/sun.nio.ch.ServerSocketChannelImpl.netBind(ServerSocketChannelImpl.java:337)
	at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:294)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
	at java.base/java.lang.Thread.run(Thread.java:833)
23/11/02 05:11:14 INFO SparkContext: Successfully stopped SparkContext
Quitting from lines 250-251 [init] (RGMQL-vignette.Rmd)
Error: processing vignette 'RGMQL-vignette.Rmd' failed with diagnostics:
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
--- failed re-building ‘RGMQL-vignette.Rmd’

SUMMARY: processing the following file failed:
  ‘RGMQL-vignette.Rmd’

Error: Vignette re-building failed.
Execution halted
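The failure above is not an R error in the package code but a Spark networking problem on the builder: the embedded Spark 2.2.0 driver cannot bind a local port while the vignette's [init] chunk starts a local session. As a minimal sketch of a possible workaround (assumptions: the bundled Spark honors the standard SPARK_LOCAL_IP environment variable for the driver bind address, and init_gmql() is the initialization entry point used by the vignette; this is not a verified fix for the builders), one might pin the bind address to the loopback interface before initializing:

    # Sketch only: assumes setting SPARK_LOCAL_IP before Spark starts makes the
    # 'sparkDriver' service bind to 127.0.0.1 instead of an unresolvable hostname.
    Sys.setenv(SPARK_LOCAL_IP = "127.0.0.1")
    library(RGMQL)
    init_gmql()   # assumed local-session initialization, as in the vignette's [init] chunk

If the environment variable approach works, the same setting can be exported in the shell before running the R CMD build command shown above, which avoids touching the vignette source.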