renaissance-movie-lens_0

[2025-10-22T14:25:58.310Z] Running test renaissance-movie-lens_0 ...
[2025-10-22T14:25:58.310Z] ===============================================
[2025-10-22T14:25:58.310Z] renaissance-movie-lens_0 Start Time: Wed Oct 22 14:25:53 2025 Epoch Time (ms): 1761143153182
[2025-10-22T14:25:58.310Z] variation: NoOptions
[2025-10-22T14:25:58.310Z] JVM_OPTIONS:
[2025-10-22T14:25:58.310Z] { \
[2025-10-22T14:25:58.310Z] echo ""; echo "TEST SETUP:"; \
[2025-10-22T14:25:58.310Z] echo "Nothing to be done for setup."; \
[2025-10-22T14:25:58.310Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_1/aqa-tests/TKG/../TKG/output_17611376318083/renaissance-movie-lens_0"; \
[2025-10-22T14:25:58.310Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_1/aqa-tests/TKG/../TKG/output_17611376318083/renaissance-movie-lens_0"; \
[2025-10-22T14:25:58.310Z] echo ""; echo "TESTING:"; \
[2025-10-22T14:25:58.310Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_1/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_1/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_1/aqa-tests/TKG/../TKG/output_17611376318083/renaissance-movie-lens_0"/movie-lens.json" movie-lens; \
[2025-10-22T14:25:58.310Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-movie-lens_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_1/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_1/aqa-tests/TKG/../TKG/output_17611376318083/renaissance-movie-lens_0"; else echo "-----------------------------------"; echo "renaissance-movie-lens_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-10-22T14:25:58.310Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-10-22T14:25:58.310Z] echo "Nothing to be done for teardown."; \
[2025-10-22T14:25:58.310Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_riscv64_linux_testList_1/aqa-tests/TKG/../TKG/output_17611376318083/TestTargetResult";
[2025-10-22T14:25:58.310Z]
[2025-10-22T14:25:58.310Z] TEST SETUP:
[2025-10-22T14:25:58.310Z] Nothing to be done for setup.
[2025-10-22T14:25:58.310Z]
[2025-10-22T14:25:58.310Z] TESTING:
[2025-10-22T14:26:17.397Z] NOTE: 'movie-lens' benchmark uses Spark local executor with 4 (out of 4) threads.
[2025-10-22T14:26:50.698Z] 14:26:48.465 WARN [dispatcher-event-loop-2] org.apache.spark.scheduler.TaskSetManager - Stage 8 contains a task of very large size (1401 KiB). The maximum recommended task size is 1000 KiB.
[2025-10-22T14:27:01.488Z] Got 100004 ratings from 671 users on 9066 movies.
[2025-10-22T14:27:04.434Z] Training: 60056, validation: 20285, test: 19854
[2025-10-22T14:27:04.434Z] ====== movie-lens (apache-spark) [default], iteration 0 started ======
[2025-10-22T14:27:04.434Z] GC before operation: completed in 572.221 ms, heap usage 359.646 MB -> 77.023 MB.
[2025-10-22T14:27:32.166Z] RMSE (validation) = 3.6219689545487617 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-10-22T14:27:48.029Z] RMSE (validation) = 2.1340923218923 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-10-22T14:28:03.981Z] RMSE (validation) = 1.310518647704681 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-10-22T14:28:17.106Z] RMSE (validation) = 0.9919630846870685 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-10-22T14:28:23.002Z] RMSE (validation) = 1.2070175349451324 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-10-22T14:28:30.261Z] RMSE (validation) = 1.114680167277025 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-10-22T14:28:37.517Z] RMSE (validation) = 0.922741950338674 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-10-22T14:28:44.766Z] RMSE (validation) = 0.898064398059034 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-10-22T14:28:44.766Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9063252168319611.
[2025-10-22T14:28:45.096Z] The best model improves the baseline by 14.52%.
[2025-10-22T14:28:46.244Z] Top recommended movies for user id 72:
[2025-10-22T14:28:46.244Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.659, id: 67504)
[2025-10-22T14:28:46.244Z] 2: Goat, The (1921) (rating: 4.659, id: 83318)
[2025-10-22T14:28:46.244Z] 3: Play House, The (1921) (rating: 4.659, id: 83359)
[2025-10-22T14:28:46.244Z] 4: Cops (1922) (rating: 4.659, id: 83411)
[2025-10-22T14:28:46.244Z] 5: Dear Frankie (2004) (rating: 4.267, id: 8530)
[2025-10-22T14:28:46.586Z] ====== movie-lens (apache-spark) [default], iteration 0 completed (101823.107 ms) ======
[2025-10-22T14:28:46.586Z] ====== movie-lens (apache-spark) [default], iteration 1 started ======
[2025-10-22T14:28:47.307Z] GC before operation: completed in 890.919 ms, heap usage 213.977 MB -> 91.731 MB.
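(Editor's note: the "RMSE (validation)" figures above are the standard root-mean-square error between the model's predicted ratings and the held-out actual ratings, computed per (rank, lambda, numIter) combination. A minimal self-contained sketch of that metric, using hypothetical ratings rather than the benchmark's data:)

```java
// Minimal RMSE sketch: sqrt of the mean squared difference between
// predicted and actual ratings. Arrays are hypothetical examples.
public class RmseSketch {
    static double rmse(double[] predicted, double[] actual) {
        double sumSq = 0.0;
        for (int i = 0; i < predicted.length; i++) {
            double diff = predicted[i] - actual[i];
            sumSq += diff * diff;
        }
        return Math.sqrt(sumSq / predicted.length);
    }

    public static void main(String[] args) {
        double[] predicted = {3.8, 4.1, 2.9};
        double[] actual = {4.0, 4.0, 3.0};
        System.out.println("RMSE = " + rmse(predicted, actual));
    }
}
```

(The grid search in the log simply keeps the (rank, lambda, numIter) combination with the lowest validation RMSE, then reports that model's RMSE on the test split.)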
[2025-10-22T14:29:00.414Z] RMSE (validation) = 3.621968954548761 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-10-22T14:29:11.264Z] RMSE (validation) = 2.1340923218923 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-10-22T14:29:20.131Z] RMSE (validation) = 1.310518647704681 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-10-22T14:29:30.933Z] RMSE (validation) = 0.9919630846870686 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-10-22T14:29:36.808Z] RMSE (validation) = 1.2070175349451324 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-10-22T14:29:42.696Z] RMSE (validation) = 1.114680167277025 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-10-22T14:29:48.597Z] RMSE (validation) = 0.922741950338674 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-10-22T14:29:54.470Z] RMSE (validation) = 0.898064398059034 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-10-22T14:29:55.067Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9063252168319611.
[2025-10-22T14:29:55.067Z] The best model improves the baseline by 14.52%.
[2025-10-22T14:29:56.215Z] Top recommended movies for user id 72:
[2025-10-22T14:29:56.215Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.659, id: 67504)
[2025-10-22T14:29:56.215Z] 2: Goat, The (1921) (rating: 4.659, id: 83318)
[2025-10-22T14:29:56.215Z] 3: Play House, The (1921) (rating: 4.659, id: 83359)
[2025-10-22T14:29:56.215Z] 4: Cops (1922) (rating: 4.659, id: 83411)
[2025-10-22T14:29:56.215Z] 5: Dear Frankie (2004) (rating: 4.267, id: 8530)
[2025-10-22T14:29:56.215Z] ====== movie-lens (apache-spark) [default], iteration 1 completed (68676.934 ms) ======
[2025-10-22T14:29:56.215Z] ====== movie-lens (apache-spark) [default], iteration 2 started ======
[2025-10-22T14:29:56.940Z] GC before operation: completed in 877.985 ms, heap usage 518.800 MB -> 90.252 MB.
[2025-10-22T14:30:07.784Z] RMSE (validation) = 3.621968954548761 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-10-22T14:30:18.693Z] RMSE (validation) = 2.1340923218923 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-10-22T14:30:27.548Z] RMSE (validation) = 1.310518647704681 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-10-22T14:30:36.415Z] RMSE (validation) = 0.9919630846870686 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-10-22T14:30:42.292Z] RMSE (validation) = 1.2070175349451324 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-10-22T14:30:48.181Z] RMSE (validation) = 1.114680167277025 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-10-22T14:30:55.429Z] RMSE (validation) = 0.922741950338674 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-10-22T14:31:00.156Z] RMSE (validation) = 0.898064398059034 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-10-22T14:31:00.855Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9063252168319611.
[2025-10-22T14:31:01.193Z] The best model improves the baseline by 14.52%.
[2025-10-22T14:31:01.917Z] Top recommended movies for user id 72:
[2025-10-22T14:31:01.917Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.659, id: 67504)
[2025-10-22T14:31:01.917Z] 2: Goat, The (1921) (rating: 4.659, id: 83318)
[2025-10-22T14:31:01.917Z] 3: Play House, The (1921) (rating: 4.659, id: 83359)
[2025-10-22T14:31:01.917Z] 4: Cops (1922) (rating: 4.659, id: 83411)
[2025-10-22T14:31:01.917Z] 5: Dear Frankie (2004) (rating: 4.267, id: 8530)
[2025-10-22T14:31:01.917Z] ====== movie-lens (apache-spark) [default], iteration 2 completed (65054.431 ms) ======
[2025-10-22T14:31:01.917Z] ====== movie-lens (apache-spark) [default], iteration 3 started ======
[2025-10-22T14:31:02.800Z] GC before operation: completed in 900.904 ms, heap usage 504.887 MB -> 90.838 MB.
[2025-10-22T14:31:07.561Z] 14:31:06.597 ERROR [Executor task launch worker for task 2.0 in stage 4701.0 (TID 4605)] org.apache.spark.executor.Executor - Exception in task 2.0 in stage 4701.0 (TID 4605)
[2025-10-22T14:31:07.561Z] java.lang.IllegalMonitorStateException: attempt to unlock read lock, not locked by current thread
[2025-10-22T14:31:07.561Z] at java.util.concurrent.locks.ReentrantReadWriteLock$Sync.unmatchedUnlockException(ReentrantReadWriteLock.java:448) ~[?:?]
[2025-10-22T14:31:07.561Z] at java.util.concurrent.locks.ReentrantReadWriteLock$Sync.tryReleaseShared(ReentrantReadWriteLock.java:432) ~[?:?]
[2025-10-22T14:31:07.561Z] at java.util.concurrent.locks.AbstractQueuedSynchronizer.releaseShared(AbstractQueuedSynchronizer.java:1094) ~[?:?]
[2025-10-22T14:31:07.561Z] at java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock.unlock(ReentrantReadWriteLock.java:897) ~[?:?]
[2025-10-22T14:31:07.561Z] at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLoggersInContext(AbstractLoggerAdapter.java:74) ~[log4j-api-2.20.0.jar:2.20.0]
[2025-10-22T14:31:07.561Z] at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:48) ~[log4j-api-2.20.0.jar:2.20.0]
[2025-10-22T14:31:07.561Z] at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:33) ~[log4j-slf4j2-impl-2.20.0.jar:2.20.0]
[2025-10-22T14:31:07.561Z] at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:432) ~[slf4j-api-2.0.16.jar:2.0.16]
[2025-10-22T14:31:07.561Z] at org.apache.spark.internal.Logging.log(Logging.scala:53) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.561Z] at org.apache.spark.internal.Logging.log$(Logging.scala:50) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.561Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.log(ShuffleBlockFetcherIterator.scala:86) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.561Z] at org.apache.spark.internal.Logging.logDebug(Logging.scala:64) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.561Z] at org.apache.spark.internal.Logging.logDebug$(Logging.scala:63) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.561Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.logDebug(ShuffleBlockFetcherIterator.scala:86) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.partitionBlocksByFetchMode(ShuffleBlockFetcherIterator.scala:389) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.initialize(ShuffleBlockFetcherIterator.scala:707) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.<init>(ShuffleBlockFetcherIterator.scala:194) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:73) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:106) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.CoGroupedRDD.$anonfun$compute$2(CoGroupedRDD.scala:140) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) ~[scala-library-2.13.15.jar:?]
[2025-10-22T14:31:07.562Z] at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) ~[scala-library-2.13.15.jar:?]
[2025-10-22T14:31:07.562Z] at scala.collection.AbstractIterable.foreach(Iterable.scala:935) ~[scala-library-2.13.15.jar:?]
[2025-10-22T14:31:07.562Z] at scala.collection.IterableOps$WithFilter.foreach(Iterable.scala:905) ~[scala-library-2.13.15.jar:?]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.CoGroupedRDD.compute(CoGroupedRDD.scala:136) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.CoGroupedRDD.$anonfun$compute$2(CoGroupedRDD.scala:140) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) ~[scala-library-2.13.15.jar:?]
[2025-10-22T14:31:07.562Z] at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) ~[scala-library-2.13.15.jar:?]
[2025-10-22T14:31:07.562Z] at scala.collection.AbstractIterable.foreach(Iterable.scala:935) ~[scala-library-2.13.15.jar:?]
[2025-10-22T14:31:07.562Z] at scala.collection.IterableOps$WithFilter.foreach(Iterable.scala:905) ~[scala-library-2.13.15.jar:?]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.CoGroupedRDD.compute(CoGroupedRDD.scala:136) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.scheduler.Task.run(Task.scala:141) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-10-22T14:31:07.562Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-10-22T14:31:07.562Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-10-22T14:31:07.562Z] at java.lang.Thread.run(Thread.java:840) [?:?]
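(Editor's note: the failure above is `ReentrantReadWriteLock` rejecting a read-lock `unlock()` that the current thread never acquired; the JDK raises exactly this `IllegalMonitorStateException`. A minimal self-contained reproduction of the exception semantics, unrelated to the Spark/riscv64 run itself:)

```java
import java.util.concurrent.locks.ReentrantReadWriteLock;

// Calling unlock() on a read lock the current thread does not hold
// throws IllegalMonitorStateException, the same exception seen in the
// executor stack trace above (via log4j's AbstractLoggerAdapter).
public class UnmatchedUnlock {
    public static void main(String[] args) {
        ReentrantReadWriteLock lock = new ReentrantReadWriteLock();
        try {
            lock.readLock().unlock(); // never locked by this thread
        } catch (IllegalMonitorStateException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

(A matched lock/unlock pair, by contrast, completes without error; the log's trace suggests the lock state observed by the unlocking thread was inconsistent, which is why the task aborted.)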
[2025-10-22T14:31:07.562Z] 14:31:06.734 WARN [task-result-getter-1] org.apache.spark.scheduler.TaskSetManager - Lost task 2.0 in stage 4701.0 (TID 4605) (test-rise-ubuntu2404-riscv64-1.adoptopenjdk.net executor driver): java.lang.IllegalMonitorStateException: attempt to unlock read lock, not locked by current thread
[2025-10-22T14:31:07.562Z] at java.base/java.util.concurrent.locks.ReentrantReadWriteLock$Sync.unmatchedUnlockException(ReentrantReadWriteLock.java:448)
[2025-10-22T14:31:07.562Z] at java.base/java.util.concurrent.locks.ReentrantReadWriteLock$Sync.tryReleaseShared(ReentrantReadWriteLock.java:432)
[2025-10-22T14:31:07.562Z] at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.releaseShared(AbstractQueuedSynchronizer.java:1094)
[2025-10-22T14:31:07.562Z] at java.base/java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock.unlock(ReentrantReadWriteLock.java:897)
[2025-10-22T14:31:07.562Z] at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLoggersInContext(AbstractLoggerAdapter.java:74)
[2025-10-22T14:31:07.562Z] at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:48)
[2025-10-22T14:31:07.562Z] at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:33)
[2025-10-22T14:31:07.562Z] at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:432)
[2025-10-22T14:31:07.562Z] at org.apache.spark.internal.Logging.log(Logging.scala:53)
[2025-10-22T14:31:07.562Z] at org.apache.spark.internal.Logging.log$(Logging.scala:50)
[2025-10-22T14:31:07.562Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.log(ShuffleBlockFetcherIterator.scala:86)
[2025-10-22T14:31:07.562Z] at org.apache.spark.internal.Logging.logDebug(Logging.scala:64)
[2025-10-22T14:31:07.562Z] at org.apache.spark.internal.Logging.logDebug$(Logging.scala:63)
[2025-10-22T14:31:07.562Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.logDebug(ShuffleBlockFetcherIterator.scala:86)
[2025-10-22T14:31:07.562Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.partitionBlocksByFetchMode(ShuffleBlockFetcherIterator.scala:389)
[2025-10-22T14:31:07.562Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.initialize(ShuffleBlockFetcherIterator.scala:707)
[2025-10-22T14:31:07.562Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.<init>(ShuffleBlockFetcherIterator.scala:194)
[2025-10-22T14:31:07.562Z] at org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:73)
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:106)
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.562Z] at org.apache.spark.rdd.CoGroupedRDD.$anonfun$compute$2(CoGroupedRDD.scala:140)
[2025-10-22T14:31:07.562Z] at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619)
[2025-10-22T14:31:07.562Z] at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617)
[2025-10-22T14:31:07.562Z] at scala.collection.AbstractIterable.foreach(Iterable.scala:935)
[2025-10-22T14:31:07.562Z] at scala.collection.IterableOps$WithFilter.foreach(Iterable.scala:905)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.CoGroupedRDD.compute(CoGroupedRDD.scala:136)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.CoGroupedRDD.$anonfun$compute$2(CoGroupedRDD.scala:140)
[2025-10-22T14:31:07.563Z] at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619)
[2025-10-22T14:31:07.563Z] at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617)
[2025-10-22T14:31:07.563Z] at scala.collection.AbstractIterable.foreach(Iterable.scala:935)
[2025-10-22T14:31:07.563Z] at scala.collection.IterableOps$WithFilter.foreach(Iterable.scala:905)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.CoGroupedRDD.compute(CoGroupedRDD.scala:136)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
[2025-10-22T14:31:07.563Z] at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
[2025-10-22T14:31:07.563Z] at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
[2025-10-22T14:31:07.563Z] at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
[2025-10-22T14:31:07.563Z] at org.apache.spark.scheduler.Task.run(Task.scala:141)
[2025-10-22T14:31:07.563Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
[2025-10-22T14:31:07.563Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
[2025-10-22T14:31:07.563Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
[2025-10-22T14:31:07.563Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
[2025-10-22T14:31:07.563Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
[2025-10-22T14:31:07.563Z] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[2025-10-22T14:31:07.563Z] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[2025-10-22T14:31:07.563Z] at java.base/java.lang.Thread.run(Thread.java:840)
[2025-10-22T14:31:07.563Z]
[2025-10-22T14:31:07.563Z] 14:31:06.741 ERROR [task-result-getter-1] org.apache.spark.scheduler.TaskSetManager - Task 2 in stage 4701.0 failed 1 times; aborting job
[2025-10-22T14:31:07.563Z] ====== movie-lens (apache-spark) [default], iteration 3 failed (SparkException) ======
[2025-10-22T14:31:07.563Z] Benchmark 'movie-lens' failed with exception:
[2025-10-22T14:31:07.563Z] org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 4701.0 failed 1 times, most recent failure: Lost task 2.0 in stage 4701.0 (TID 4605) (test-rise-ubuntu2404-riscv64-1.adoptopenjdk.net executor driver): java.lang.IllegalMonitorStateException: attempt to unlock read lock, not locked by current thread
[2025-10-22T14:31:07.563Z] at java.base/java.util.concurrent.locks.ReentrantReadWriteLock$Sync.unmatchedUnlockException(ReentrantReadWriteLock.java:448)
[2025-10-22T14:31:07.563Z] at java.base/java.util.concurrent.locks.ReentrantReadWriteLock$Sync.tryReleaseShared(ReentrantReadWriteLock.java:432)
[2025-10-22T14:31:07.563Z] at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.releaseShared(AbstractQueuedSynchronizer.java:1094)
[2025-10-22T14:31:07.563Z] at java.base/java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock.unlock(ReentrantReadWriteLock.java:897)
[2025-10-22T14:31:07.563Z] at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLoggersInContext(AbstractLoggerAdapter.java:74)
[2025-10-22T14:31:07.563Z] at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:48)
[2025-10-22T14:31:07.563Z] at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:33)
[2025-10-22T14:31:07.563Z] at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:432)
[2025-10-22T14:31:07.563Z] at org.apache.spark.internal.Logging.log(Logging.scala:53)
[2025-10-22T14:31:07.563Z] at org.apache.spark.internal.Logging.log$(Logging.scala:50)
[2025-10-22T14:31:07.563Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.log(ShuffleBlockFetcherIterator.scala:86)
[2025-10-22T14:31:07.563Z] at org.apache.spark.internal.Logging.logDebug(Logging.scala:64)
[2025-10-22T14:31:07.563Z] at org.apache.spark.internal.Logging.logDebug$(Logging.scala:63)
[2025-10-22T14:31:07.563Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.logDebug(ShuffleBlockFetcherIterator.scala:86)
[2025-10-22T14:31:07.563Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.partitionBlocksByFetchMode(ShuffleBlockFetcherIterator.scala:389)
[2025-10-22T14:31:07.563Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.initialize(ShuffleBlockFetcherIterator.scala:707)
[2025-10-22T14:31:07.563Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.<init>(ShuffleBlockFetcherIterator.scala:194)
[2025-10-22T14:31:07.563Z] at org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:73)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:106)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.CoGroupedRDD.$anonfun$compute$2(CoGroupedRDD.scala:140)
[2025-10-22T14:31:07.563Z] at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619)
[2025-10-22T14:31:07.563Z] at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617)
[2025-10-22T14:31:07.563Z] at scala.collection.AbstractIterable.foreach(Iterable.scala:935)
[2025-10-22T14:31:07.563Z] at scala.collection.IterableOps$WithFilter.foreach(Iterable.scala:905)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.CoGroupedRDD.compute(CoGroupedRDD.scala:136)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.CoGroupedRDD.$anonfun$compute$2(CoGroupedRDD.scala:140)
[2025-10-22T14:31:07.563Z] at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619)
[2025-10-22T14:31:07.563Z] at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617)
[2025-10-22T14:31:07.563Z] at scala.collection.AbstractIterable.foreach(Iterable.scala:935)
[2025-10-22T14:31:07.563Z] at scala.collection.IterableOps$WithFilter.foreach(Iterable.scala:905)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.CoGroupedRDD.compute(CoGroupedRDD.scala:136)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.563Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.564Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.564Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.564Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.564Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.564Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.564Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
[2025-10-22T14:31:07.564Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
[2025-10-22T14:31:07.564Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
[2025-10-22T14:31:07.564Z] at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
[2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
[2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
[2025-10-22T14:31:07.564Z] at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
[2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.Task.run(Task.scala:141)
[2025-10-22T14:31:07.564Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
[2025-10-22T14:31:07.564Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
[2025-10-22T14:31:07.564Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
[2025-10-22T14:31:07.564Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
[2025-10-22T14:31:07.564Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
[2025-10-22T14:31:07.564Z] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[2025-10-22T14:31:07.564Z] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[2025-10-22T14:31:07.564Z] at java.base/java.lang.Thread.run(Thread.java:840)
[2025-10-22T14:31:07.564Z]
[2025-10-22T14:31:07.564Z] Driver stacktrace:
[2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2856)
[2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2792)
[2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2791)
[2025-10-22T14:31:07.564Z] at scala.collection.immutable.List.foreach(List.scala:334)
[2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2791)
[2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1247)
[2025-10-22T14:31:07.564Z]
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1247) [2025-10-22T14:31:07.564Z] at scala.Option.foreach(Option.scala:437) [2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1247) [2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:3060) [2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2994) [2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2983) [2025-10-22T14:31:07.564Z] at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49) [2025-10-22T14:31:07.564Z] at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:989) [2025-10-22T14:31:07.564Z] at org.apache.spark.SparkContext.runJob(SparkContext.scala:2393) [2025-10-22T14:31:07.564Z] at org.apache.spark.SparkContext.runJob(SparkContext.scala:2414) [2025-10-22T14:31:07.564Z] at org.apache.spark.SparkContext.runJob(SparkContext.scala:2433) [2025-10-22T14:31:07.564Z] at org.apache.spark.SparkContext.runJob(SparkContext.scala:2458) [2025-10-22T14:31:07.564Z] at org.apache.spark.rdd.RDD.count(RDD.scala:1296) [2025-10-22T14:31:07.564Z] at org.apache.spark.ml.recommendation.ALS$.$anonfun$train$9(ALS.scala:1055) [2025-10-22T14:31:07.564Z] at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:190) [2025-10-22T14:31:07.564Z] at org.apache.spark.ml.recommendation.ALS$.train(ALS.scala:1049) [2025-10-22T14:31:07.564Z] at org.apache.spark.mllib.recommendation.ALS.run(ALS.scala:256) [2025-10-22T14:31:07.564Z] at org.renaissance.apache.spark.MovieLens$MovieLensHelper.trainModel(MovieLens.scala:264) [2025-10-22T14:31:07.564Z] at org.renaissance.apache.spark.MovieLens$MovieLensHelper.$anonfun$trainModels$1(MovieLens.scala:240) [2025-10-22T14:31:07.564Z] at 
org.renaissance.apache.spark.MovieLens$MovieLensHelper.$anonfun$trainModels$1$adapted(MovieLens.scala:239) [2025-10-22T14:31:07.564Z] at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) [2025-10-22T14:31:07.564Z] at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) [2025-10-22T14:31:07.564Z] at scala.collection.AbstractIterable.foreach(Iterable.scala:935) [2025-10-22T14:31:07.564Z] at org.renaissance.apache.spark.MovieLens$MovieLensHelper.trainModels(MovieLens.scala:239) [2025-10-22T14:31:07.564Z] at org.renaissance.apache.spark.MovieLens.run(MovieLens.scala:348) [2025-10-22T14:31:07.564Z] at org.renaissance.harness.ExecutionDriver.executeOperation(ExecutionDriver.java:137) [2025-10-22T14:31:07.564Z] at org.renaissance.harness.ExecutionDriver.executeBenchmark(ExecutionDriver.java:93) [2025-10-22T14:31:07.564Z] at org.renaissance.harness.RenaissanceSuite$.runBenchmarks$$anonfun$1(RenaissanceSuite.scala:172) [2025-10-22T14:31:07.564Z] at scala.runtime.function.JProcedure1.apply(JProcedure1.java:15) [2025-10-22T14:31:07.564Z] at scala.runtime.function.JProcedure1.apply(JProcedure1.java:10) [2025-10-22T14:31:07.564Z] at scala.collection.immutable.List.foreach(List.scala:334) [2025-10-22T14:31:07.564Z] at org.renaissance.harness.RenaissanceSuite$.runBenchmarks(RenaissanceSuite.scala:161) [2025-10-22T14:31:07.564Z] at org.renaissance.harness.RenaissanceSuite$.main(RenaissanceSuite.scala:130) [2025-10-22T14:31:07.564Z] at org.renaissance.harness.RenaissanceSuite.main(RenaissanceSuite.scala) [2025-10-22T14:31:07.564Z] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [2025-10-22T14:31:07.564Z] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) [2025-10-22T14:31:07.564Z] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [2025-10-22T14:31:07.564Z] at 
java.base/java.lang.reflect.Method.invoke(Method.java:569) [2025-10-22T14:31:07.564Z] at org.renaissance.core.Launcher.loadAndInvokeHarnessClass(Launcher.java:129) [2025-10-22T14:31:07.564Z] at org.renaissance.core.Launcher.launchHarnessClass(Launcher.java:78) [2025-10-22T14:31:07.564Z] at org.renaissance.core.Launcher.main(Launcher.java:43) [2025-10-22T14:31:07.564Z] Caused by: java.lang.IllegalMonitorStateException: attempt to unlock read lock, not locked by current thread [2025-10-22T14:31:07.564Z] at java.base/java.util.concurrent.locks.ReentrantReadWriteLock$Sync.unmatchedUnlockException(ReentrantReadWriteLock.java:448) [2025-10-22T14:31:07.564Z] at java.base/java.util.concurrent.locks.ReentrantReadWriteLock$Sync.tryReleaseShared(ReentrantReadWriteLock.java:432) [2025-10-22T14:31:07.564Z] at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.releaseShared(AbstractQueuedSynchronizer.java:1094) [2025-10-22T14:31:07.564Z] at java.base/java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock.unlock(ReentrantReadWriteLock.java:897) [2025-10-22T14:31:07.564Z] at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLoggersInContext(AbstractLoggerAdapter.java:74) [2025-10-22T14:31:07.564Z] at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:48) [2025-10-22T14:31:07.564Z] at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:33) [2025-10-22T14:31:07.564Z] at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:432) [2025-10-22T14:31:07.564Z] at org.apache.spark.internal.Logging.log(Logging.scala:53) [2025-10-22T14:31:07.564Z] at org.apache.spark.internal.Logging.log$(Logging.scala:50) [2025-10-22T14:31:07.564Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.log(ShuffleBlockFetcherIterator.scala:86) [2025-10-22T14:31:07.564Z] at org.apache.spark.internal.Logging.logDebug(Logging.scala:64) [2025-10-22T14:31:07.564Z] at org.apache.spark.internal.Logging.logDebug$(Logging.scala:63) 
[2025-10-22T14:31:07.564Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.logDebug(ShuffleBlockFetcherIterator.scala:86) [2025-10-22T14:31:07.564Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.partitionBlocksByFetchMode(ShuffleBlockFetcherIterator.scala:389) [2025-10-22T14:31:07.564Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.initialize(ShuffleBlockFetcherIterator.scala:707) [2025-10-22T14:31:07.564Z] at org.apache.spark.storage.ShuffleBlockFetcherIterator.<init>(ShuffleBlockFetcherIterator.scala:194) [2025-10-22T14:31:07.564Z] at org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:73) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:106) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.CoGroupedRDD.$anonfun$compute$2(CoGroupedRDD.scala:140) [2025-10-22T14:31:07.565Z] at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) [2025-10-22T14:31:07.565Z] at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) [2025-10-22T14:31:07.565Z] at scala.collection.AbstractIterable.foreach(Iterable.scala:935) [2025-10-22T14:31:07.565Z] at scala.collection.IterableOps$WithFilter.foreach(Iterable.scala:905) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.CoGroupedRDD.compute(CoGroupedRDD.scala:136) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) [2025-10-22T14:31:07.565Z] at 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.CoGroupedRDD.$anonfun$compute$2(CoGroupedRDD.scala:140) [2025-10-22T14:31:07.565Z] at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) [2025-10-22T14:31:07.565Z] at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) [2025-10-22T14:31:07.565Z] at scala.collection.AbstractIterable.foreach(Iterable.scala:935) [2025-10-22T14:31:07.565Z] at scala.collection.IterableOps$WithFilter.foreach(Iterable.scala:905) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.CoGroupedRDD.compute(CoGroupedRDD.scala:136) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) [2025-10-22T14:31:07.565Z] at 
org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367) [2025-10-22T14:31:07.565Z] at org.apache.spark.rdd.RDD.iterator(RDD.scala:331) [2025-10-22T14:31:07.565Z] at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59) [2025-10-22T14:31:07.565Z] at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104) [2025-10-22T14:31:07.565Z] at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54) [2025-10-22T14:31:07.565Z] at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166) [2025-10-22T14:31:07.565Z] at org.apache.spark.scheduler.Task.run(Task.scala:141) [2025-10-22T14:31:07.565Z] at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620) [2025-10-22T14:31:07.565Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64) [2025-10-22T14:31:07.565Z] at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61) [2025-10-22T14:31:07.565Z] at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94) [2025-10-22T14:31:07.565Z] at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623) [2025-10-22T14:31:07.565Z] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [2025-10-22T14:31:07.565Z] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [2025-10-22T14:31:07.565Z] at java.base/java.lang.Thread.run(Thread.java:840) [2025-10-22T14:31:07.891Z] The following benchmarks failed: movie-lens [2025-10-22T14:31:08.593Z] ----------------------------------- [2025-10-22T14:31:08.593Z] renaissance-movie-lens_0_FAILED [2025-10-22T14:31:08.593Z] ----------------------------------- [2025-10-22T14:31:08.593Z] [2025-10-22T14:31:08.593Z] TEST TEARDOWN: [2025-10-22T14:31:08.593Z] Nothing to be done for teardown. 
[2025-10-22T14:31:08.593Z] renaissance-movie-lens_0 Finish Time: Wed Oct 22 14:31:08 2025 Epoch Time (ms): 1761143468416
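The `Caused by` chain in the log above bottoms out in `ReentrantReadWriteLock$ReadLock.unlock` being invoked by a thread that does not hold the read lock, reached through Log4j's `AbstractLoggerAdapter.getLoggersInContext` on a Spark executor thread. The JDK-level behavior behind that exception can be reproduced in isolation; the following minimal sketch (the class name `ReadUnlockMismatch` is ours, not taken from the log) shows that an unmatched read-lock unlock raises exactly the `IllegalMonitorStateException` message recorded in the trace:

```java
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class ReadUnlockMismatch {
    public static void main(String[] args) {
        ReentrantReadWriteLock rwLock = new ReentrantReadWriteLock();

        // A balanced lock/unlock pair on the read lock is legal.
        rwLock.readLock().lock();
        rwLock.readLock().unlock();

        // A second unlock has no matching lock held by this thread,
        // so the JDK throws IllegalMonitorStateException, mirroring
        // the "Caused by" frame in the benchmark log.
        try {
            rwLock.readLock().unlock();
        } catch (IllegalMonitorStateException e) {
            System.out.println(e.getMessage());
            // prints: attempt to unlock read lock, not locked by current thread
        }
    }
}
```

This suggests the failure originates in the logging path (the lock bookkeeping inside `AbstractLoggerAdapter`) rather than in the movie-lens benchmark code itself, which only triggers it via a `logDebug` call during shuffle block fetching.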