{"payload":{"feedbackUrl":"https://github.com/orgs/community/discussions/53140","repo":{"id":263807922,"defaultBranch":"master","name":"spark","ownerLogin":"mridulm","currentUserCanPush":false,"isFork":true,"isEmpty":false,"createdAt":"2020-05-14T03:43:03.000Z","ownerAvatar":"https://avatars.githubusercontent.com/u/1591700?v=4","public":true,"private":false,"isOrgOwned":false},"refInfo":{"name":"","listCacheKey":"v0:1708847630.0","currentOid":""},"activityList":{"items":[{"before":"89ca8b6065e9f690a492c778262080741d50d94d","after":"18b86068ff4c72ba686d3d9275f9284d58cd3ef4","ref":"refs/heads/master","pushedAt":"2024-02-25T07:54:35.000Z","pushType":"push","commitsCount":1344,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"[SPARK-47154][SS][TESTS] Fix `kafka-0-10-sql` to use `ResetSystemProperties` if `KafkaTestUtils` is used\n\n### What changes were proposed in this pull request?\n\nThis PR aims to fix `kafka-0-10-sql` module to use `ResetSystemProperties` if `KafkaTestUtils` is used. The following test suites are fixed.\n\n- ConsumerStrategySuite\n- KafkaDataConsumerSuite\n- KafkaMissingOffsetsTest\n - KafkaDontFailOnDataLossSuite\n - KafkaSourceStressForDontFailOnDataLossSuite\n- KafkaTest\n - KafkaDelegationTokenSuite\n - KafkaMicroBatchSourceSuite\n - KafkaMicroBatchV1SourceWithAdminSuite\n - KafkaMicroBatchV2SourceWithAdminSuite\n - KafkaMicroBatchV1SourceSuite\n - KafkaMicroBatchV2SourceSuite\n - KafkaSourceStressSuite\n - KafkaOffsetReaderSuite\n - KafkaRelationSuite\n - KafkaRelationSuiteWithAdminV1\n - KafkaRelationSuiteWithAdminV2\n - KafkaRelationSuiteV1\n - KafkaRelationSuiteV2\n - KafkaSinkSuite\n - KafkaSinkMicroBatchStreamingSuite\n - KafkaContinuousSinkSuite\n - KafkaSinkBatchSuiteV1\n - KafkaSinkBatchSuiteV2\n\n### Why are the changes needed?\n\nApache Spark `master` branch has two `KafkaTestUtils` classes.\n\n```\n$ find . -name KafkaTestUtils.scala\n./connector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala\n./connector/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/KafkaTestUtils.scala\n```\n\n`KafkaTestUtils` of `kafka-0-10-sql` uses `System.setProperty` and affects 8 files. 
We need to use `ResetSystemProperties` to isolate the test cases.\n\nhttps://github.com/apache/spark/blob/ee312ecb40ea5b5303fc794a3d494b6f27cda923/connector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala#L290\n\n```\n$ git grep KafkaTestUtils connector/kafka-0-10-sql | awk -F: '{print $1}' | sort | uniq\nconnector/kafka-0-10-sql/src/test/resources/log4j2.properties\nconnector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/ConsumerStrategySuite.scala\nconnector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaDelegationTokenSuite.scala\nconnector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaDontFailOnDataLossSuite.scala\nconnector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaMicroBatchSourceSuite.scala\nconnector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaOffsetReaderSuite.scala\nconnector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaRelationSuite.scala\nconnector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaSinkSuite.scala\nconnector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala\nconnector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/consumer/KafkaDataConsumerSuite.scala\n```\n\n### Does this PR introduce _any_ user-facing change?\n\nNo. This is a test-only PR.\n\n### How was this patch tested?\n\nPass the CIs.\n\n### Was this patch authored or co-authored using generative AI tooling?\n\nNo.\n\nCloses #45239 from dongjoon-hyun/SPARK-47154.\n\nAuthored-by: Dongjoon Hyun \nSigned-off-by: Dongjoon Hyun ","shortMessageHtmlLink":"[SPARK-47154][SS][TESTS] Fix kafka-0-10-sql to use `ResetSystemProp…"}},{"before":null,"after":"2d73adcfa4606282cc9bf1e022fc15afa7aebd8f","ref":"refs/heads/45052-suggestion-for-sunchao","pushedAt":"2024-02-25T07:53:50.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Modification to PR 45052, to illustrate and fix the issue being discussed w.r.t NPE","shortMessageHtmlLink":"Modification to PR 45052, to illustrate and fix the issue being discu…"}},{"before":"585b493bcdc13e84be45f3013c3a13fd3628ef63","after":"d98cefd59f431f5da89afe57ff49fcfb1651498d","ref":"refs/heads/SPARK-45762-alternative-proposal","pushedAt":"2023-11-08T08:14:59.000Z","pushType":"push","commitsCount":1,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Remove unnecessary variable","shortMessageHtmlLink":"Remove unnecessary variable"}},{"before":"d5ba0cca4eaa2cd4bcd5abea2564235e022507df","after":"585b493bcdc13e84be45f3013c3a13fd3628ef63","ref":"refs/heads/SPARK-45762-alternative-proposal","pushedAt":"2023-11-08T08:06:45.000Z","pushType":"push","commitsCount":1,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Fix test failures","shortMessageHtmlLink":"Fix test failures"}},{"before":null,"after":"d5ba0cca4eaa2cd4bcd5abea2564235e022507df","ref":"refs/heads/SPARK-45762-alternative-proposal","pushedAt":"2023-11-08T06:22:23.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"mridulm","name":"Mridul 
Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Alternative to validate comments, and to provide reference for PR 43627","shortMessageHtmlLink":"Alternative to validate comments, and to provide reference for PR 43627"}},{"before":"093b7e1846b43273538e9b50898685478ceb6089","after":"89ca8b6065e9f690a492c778262080741d50d94d","ref":"refs/heads/master","pushedAt":"2023-10-30T01:57:49.000Z","pushType":"push","commitsCount":41,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"[SPARK-45605][CORE][SQL][SS][CONNECT][MLLIB][GRAPHX][DSTREAM][PROTOBUF][EXAMPLES] Replace `s.c.MapOps.mapValues` with `s.c.MapOps.view.mapValues`\n\n### What changes were proposed in this pull request?\nThis pr replace `s.c.MapOps.mapValues` with `s.c.MapOps.view.mapValues` due to `s.c.MapOps.mapValues` marked as deprecated since Scala 2.13.0:\n\nhttps://github.com/scala/scala/blob/bf45e199e96383b96a6955520d7d2524c78e6e12/src/library/scala/collection/Map.scala#L256-L262\n\n```scala\n deprecated(\"Use .view.mapValues(f). A future version will include a strict version of this method (for now, .view.mapValues(f).toMap).\", \"2.13.0\")\n def mapValues[W](f: V => W): MapView[K, W] = new MapView.MapValues(this, f)\n```\n\n### Why are the changes needed?\nCleanup deprecated API usage.\n\n### Does this PR introduce _any_ user-facing change?\nNo\n\n### How was this patch tested?\n- Pass GitHub Acitons\n- Packaged the client, manually tested `DFSReadWriteTest/MiniReadWriteTest/PowerIterationClusteringExample`.\n\n### Was this patch authored or co-authored using generative AI tooling?\nNo\n\nCloses #43448 from LuciferYang/SPARK-45605.\n\nLead-authored-by: yangjie01 \nCo-authored-by: YangJie \nSigned-off-by: Sean Owen ","shortMessageHtmlLink":"[SPARK-45605][CORE][SQL][SS][CONNECT][MLLIB][GRAPHX][DSTREAM][PROTOBU…"}},{"before":"34794247f6c390f92ed2c8fcfd758c63a9a9e223","after":"093b7e1846b43273538e9b50898685478ceb6089","ref":"refs/heads/master","pushedAt":"2023-10-30T01:57:33.000Z","pushType":"force_push","commitsCount":0,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"[SPARK-45648][INFRA] Add `sql/api` and `common/utils` to `modules.py`\n\n### What changes were proposed in this pull request?\nAdd `sql/api` and `common/utils` to `modules.py`\n\n### Why are the changes needed?\nnew modules should be covered in `modules.py`, otherwise related tests maybe wrongly skipped in some cases\n\n### Does this PR introduce _any_ user-facing change?\nno, infra-only\n\n### How was this patch tested?\nci\n\n### Was this patch authored or co-authored using generative AI tooling?\nno\n\nCloses #43501 from zhengruifeng/infra_sql_api.\n\nAuthored-by: Ruifeng Zheng \nSigned-off-by: Hyukjin Kwon ","shortMessageHtmlLink":"[SPARK-45648][INFRA] Add sql/api and common/utils to modules.py"}},{"before":"f08219a083537b028a24c98b633471373a8a480d","after":"34794247f6c390f92ed2c8fcfd758c63a9a9e223","ref":"refs/heads/master","pushedAt":"2023-10-22T10:26:29.000Z","pushType":"push","commitsCount":0,"pusher":{"login":"mridulm","name":"Mridul 
Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"}},{"before":"d9d0d99dbfa1a1a4a356d739ecc7d56c0e1c5e50","after":"f08219a083537b028a24c98b633471373a8a480d","ref":"refs/heads/master","pushedAt":"2023-10-20T01:53:42.000Z","pushType":"push","commitsCount":1,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Mark it as DeveloperApi","shortMessageHtmlLink":"Mark it as DeveloperApi"}},{"before":"6cbf8e2d42db5d733cbd0c7d99807f9bc7ec407e","after":"d9d0d99dbfa1a1a4a356d739ecc7d56c0e1c5e50","ref":"refs/heads/master","pushedAt":"2023-10-20T01:51:59.000Z","pushType":"push","commitsCount":2,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Merge branch 'master' of github.com:apache/spark","shortMessageHtmlLink":"Merge branch 'master' of github.com:apache/spark"}},{"before":"eec090755aa5b7e6048fc004264a8f5d3591df1a","after":"6cbf8e2d42db5d733cbd0c7d99807f9bc7ec407e","ref":"refs/heads/master","pushedAt":"2023-10-20T01:51:50.000Z","pushType":"push","commitsCount":372,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Expose DeterministicLevel - this is required to override getOutputDeterministicLevel in child RDD","shortMessageHtmlLink":"Expose DeterministicLevel - this is required to override getOutputDet…"}},{"before":"a3d9e0ae0f95a55766078da5d0bf0f74f3c3cfc3","after":"eec090755aa5b7e6048fc004264a8f5d3591df1a","ref":"refs/heads/master","pushedAt":"2023-09-26T16:33:18.000Z","pushType":"push","commitsCount":1876,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"[SPARK-45211][CONNECT] Eliminated ambiguous references in `CloseableIterator#apply` to fix Scala 2.13 daily test\n\n### What changes were proposed in this pull request?\nThis pr eliminated an ambiguous references in `org.apache.spark.sql.connect.client.CloseableIterator#apply` function to make the test case `abandoned query gets INVALID_HANDLE.OPERATION_ABANDONED error` can test pass with Scala 2.13.\n\n### Why are the changes needed?\n`abandoned query gets INVALID_HANDLE.OPERATION_ABANDONED error` failed in the daily test of Scala 2.13:\n- https://github.com/apache/spark/actions/runs/6215331575/job/16868131377\n\n\"image\"\n\n### Does this PR introduce _any_ user-facing change?\nNo\n\n### How was this patch tested?\n- Pass GitHub Actions\n- Manual check\n\nrun\n\n```\ndev/change-scala-version.sh 2.13\nbuild/sbt \"connect/testOnly org.apache.spark.sql.connect.execution.ReattachableExecuteSuite\" -Pscala-2.13\n```\n\n**Before**\n\n```\n[info] ReattachableExecuteSuite:\n[info] - reattach after initial RPC ends (2 seconds, 258 milliseconds)\n[info] - raw interrupted RPC results in INVALID_CURSOR.DISCONNECTED error (30 milliseconds)\n[info] - raw new RPC interrupts previous RPC with INVALID_CURSOR.DISCONNECTED error (21 milliseconds)\n[info] - client INVALID_CURSOR.DISCONNECTED error is retried when rpc sender gets interrupted (602 milliseconds)\n[info] - client INVALID_CURSOR.DISCONNECTED error is retried when other RPC preempts this one (637 milliseconds)\n[info] - abandoned query gets 
## 2023-09-26: push to `master` (1876 commits)

`a3d9e0ae0f95a55766078da5d0bf0f74f3c3cfc3` → `eec090755aa5b7e6048fc004264a8f5d3591df1a`. Head commit:

**[SPARK-45211][CONNECT] Eliminated ambiguous references in `CloseableIterator#apply` to fix Scala 2.13 daily test**

### What changes were proposed in this pull request?
This PR eliminates an ambiguous reference in the `org.apache.spark.sql.connect.client.CloseableIterator#apply` function so that the test case `abandoned query gets INVALID_HANDLE.OPERATION_ABANDONED error` passes with Scala 2.13.

### Why are the changes needed?
`abandoned query gets INVALID_HANDLE.OPERATION_ABANDONED error` failed in the Scala 2.13 daily test:
- https://github.com/apache/spark/actions/runs/6215331575/job/16868131377

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
- Pass GitHub Actions
- Manual check

Run:

```
dev/change-scala-version.sh 2.13
build/sbt "connect/testOnly org.apache.spark.sql.connect.execution.ReattachableExecuteSuite" -Pscala-2.13
```

**Before**

```
[info] ReattachableExecuteSuite:
[info] - reattach after initial RPC ends (2 seconds, 258 milliseconds)
[info] - raw interrupted RPC results in INVALID_CURSOR.DISCONNECTED error (30 milliseconds)
[info] - raw new RPC interrupts previous RPC with INVALID_CURSOR.DISCONNECTED error (21 milliseconds)
[info] - client INVALID_CURSOR.DISCONNECTED error is retried when rpc sender gets interrupted (602 milliseconds)
[info] - client INVALID_CURSOR.DISCONNECTED error is retried when other RPC preempts this one (637 milliseconds)
[info] - abandoned query gets INVALID_HANDLE.OPERATION_ABANDONED error *** FAILED *** (70 milliseconds)
[info]   Expected exception org.apache.spark.SparkException to be thrown, but java.lang.StackOverflowError was thrown (ReattachableExecuteSuite.scala:172)
[info]   org.scalatest.exceptions.TestFailedException:
[info]   at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
[info]   at org.scalatest.Assertions.intercept(Assertions.scala:756)
[info]   at org.apache.spark.sql.connect.execution.ReattachableExecuteSuite.$anonfun$new$18(ReattachableExecuteSuite.scala:172)
[info]   at org.apache.spark.sql.connect.SparkConnectServerTest.withCustomBlockingStub(SparkConnectServerTest.scala:222)
[info]   at org.apache.spark.sql.connect.SparkConnectServerTest.withClient(SparkConnectServerTest.scala:199)
[info]   ...
[info]   Cause: java.lang.StackOverflowError:
[info]   at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)
[info]   at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)
[info]   at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)
[info]   at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)
...
[info] - client releases responses directly after consuming them (236 milliseconds)
[info] - server releases responses automatically when client moves ahead (336 milliseconds)
[info] - big query (863 milliseconds)
[info] - big query and slow client (7 seconds, 14 milliseconds)
[info] - big query with frequent reattach (735 milliseconds)
[info] - big query with frequent reattach and slow client (7 seconds, 606 milliseconds)
[info] - long sleeping query (10 seconds, 156 milliseconds)
[info] Run completed in 34 seconds, 522 milliseconds.
[info] Total number of tests run: 13
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 12, failed 1, canceled 0, ignored 0, pending 0
[info] *** 1 TEST FAILED ***
[error] Failed tests:
[error] 	org.apache.spark.sql.connect.execution.ReattachableExecuteSuite
```

**After**

```
[info] ReattachableExecuteSuite:
[info] - reattach after initial RPC ends (2 seconds, 134 milliseconds)
[info] - raw interrupted RPC results in INVALID_CURSOR.DISCONNECTED error (26 milliseconds)
[info] - raw new RPC interrupts previous RPC with INVALID_CURSOR.DISCONNECTED error (19 milliseconds)
[info] - client INVALID_CURSOR.DISCONNECTED error is retried when rpc sender gets interrupted (328 milliseconds)
[info] - client INVALID_CURSOR.DISCONNECTED error is retried when other RPC preempts this one (562 milliseconds)
[info] - abandoned query gets INVALID_HANDLE.OPERATION_ABANDONED error (46 milliseconds)
[info] - client releases responses directly after consuming them (231 milliseconds)
[info] - server releases responses automatically when client moves ahead (359 milliseconds)
[info] - big query (978 milliseconds)
[info] - big query and slow client (7 seconds, 50 milliseconds)
[info] - big query with frequent reattach (703 milliseconds)
[info] - big query with frequent reattach and slow client (7 seconds, 626 milliseconds)
[info] - long sleeping query (10 seconds, 141 milliseconds)
[info] Run completed in 33 seconds, 844 milliseconds.
[info] Total number of tests run: 13
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 13, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
```

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #42981 from LuciferYang/CloseableIterator-apply.

Authored-by: yangjie01
Signed-off-by: yangjie01
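The repeating frame in the trace above is the signature of self-delegation: `hasNext` resolving back to the wrapper instead of the wrapped iterator, so each call re-enters itself until the stack overflows. A distilled illustration of that failure mode and of the fix style the title describes (eliminating the ambiguity); the names here are illustrative, not the actual Spark Connect code:

```scala
// Distilled illustration (not the real CloseableIterator source): if a
// reference inside the anonymous subclass is ambiguous and resolves to
// the new object rather than the enclosing parameter, hasNext calls
// hasNext forever and throws StackOverflowError.
object AmbiguousReferenceExample {
  // Fix style: bind the parameter to a fresh, unambiguous local name
  // before the anonymous class, so delegation cannot self-reference.
  def wrap[A](iterator: Iterator[A]): Iterator[A] = {
    val underlying = iterator
    new Iterator[A] {
      override def hasNext: Boolean = underlying.hasNext
      override def next(): A = underlying.next()
    }
  }

  def main(args: Array[String]): Unit = {
    val it = wrap(Iterator(1, 2, 3))
    println(it.toList) // List(1, 2, 3)
  }
}
```

Binding the parameter to a distinct local name makes the delegation target unambiguous regardless of how names in the enclosing scopes shadow each other.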
## 2023-03-28: push to `master` (163 commits)

`60849b78204e69392976420b9a813bed0790e4e9` → `a3d9e0ae0f95a55766078da5d0bf0f74f3c3cfc3`. Head commit:

**[SPARK-42934][BUILD] Add `spark.hadoop.hadoop.security.key.provider.path` to `scalatest-maven-plugin`**

### What changes were proposed in this pull request?
When testing `OrcEncryptionSuite` with Maven, all of its tests were always skipped. This PR adds `spark.hadoop.hadoop.security.key.provider.path` to the `systemProperties` of `scalatest-maven-plugin` so that `OrcEncryptionSuite` can be run with Maven.

### Why are the changes needed?
Make `OrcEncryptionSuite` testable with Maven.

### Does this PR introduce _any_ user-facing change?
No, just for Maven tests.

### How was this patch tested?

- Pass GitHub Actions
- Manual testing:

Run:

```
build/mvn clean install -pl sql/core -DskipTests -am
build/mvn test -pl sql/core -Dtest=none -DwildcardSuites=org.apache.spark.sql.execution.datasources.orc.OrcEncryptionSuite
```

**Before**

```
Discovery starting.
Discovery completed in 3 seconds, 218 milliseconds.
Run starting. Expected test count is: 4
OrcEncryptionSuite:
21:57:58.344 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- Write and read an encrypted file !!! CANCELED !!!
  [] was empty org.apache.orc.impl.NullKeyProvider5af5d76f doesn't has the test keys. ORC shim is created with old Hadoop libraries (OrcEncryptionSuite.scala:37)
- Write and read an encrypted table !!! CANCELED !!!
  [] was empty org.apache.orc.impl.NullKeyProvider5ad6cc21 doesn't has the test keys. ORC shim is created with old Hadoop libraries (OrcEncryptionSuite.scala:65)
- SPARK-35325: Write and read encrypted nested columns !!! CANCELED !!!
  [] was empty org.apache.orc.impl.NullKeyProvider691124ee doesn't has the test keys. ORC shim is created with old Hadoop libraries (OrcEncryptionSuite.scala:116)
- SPARK-35992: Write and read fully-encrypted columns with default masking !!! CANCELED !!!
  [] was empty org.apache.orc.impl.NullKeyProvider5403799b doesn't has the test keys. ORC shim is created with old Hadoop libraries (OrcEncryptionSuite.scala:166)
21:58:00.035 WARN org.apache.spark.sql.execution.datasources.orc.OrcEncryptionSuite:

===== POSSIBLE THREAD LEAK IN SUITE o.a.s.sql.execution.datasources.orc.OrcEncryptionSuite, threads: rpc-boss-3-1 (daemon=true), shuffle-boss-6-1 (daemon=true) =====

Run completed in 5 seconds, 41 milliseconds.
Total number of tests run: 0
Suites: completed 2, aborted 0
Tests: succeeded 0, failed 0, canceled 4, ignored 0, pending 0
No tests were executed.
```

**After**

```
Discovery starting.
Discovery completed in 3 seconds, 185 milliseconds.
Run starting. Expected test count is: 4
OrcEncryptionSuite:
21:58:46.540 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- Write and read an encrypted file
- Write and read an encrypted table
- SPARK-35325: Write and read encrypted nested columns
- SPARK-35992: Write and read fully-encrypted columns with default masking
21:58:51.933 WARN org.apache.spark.sql.execution.datasources.orc.OrcEncryptionSuite:

===== POSSIBLE THREAD LEAK IN SUITE o.a.s.sql.execution.datasources.orc.OrcEncryptionSuite, threads: rpc-boss-3-1 (daemon=true), shuffle-boss-6-1 (daemon=true) =====

Run completed in 8 seconds, 708 milliseconds.
Total number of tests run: 4
Suites: completed 2, aborted 0
Tests: succeeded 4, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
```

Closes #40566 from LuciferYang/SPARK-42934-2.

Authored-by: yangjie01
Signed-off-by: Dongjoon Hyun

## 2023-03-27: branch `SPARK-42922` created

Created at `b31cd21b7bae33e5585294900936fa51d2c5105b`. Head commit: "SPARK-42922: Move from Random to SecureRandom".
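The head commit of the new branch swaps `java.util.Random` for `java.security.SecureRandom`. A minimal sketch of that general pattern (not the specific call site the branch touches): `Random` is a fast but predictable seeded PRNG, while `SecureRandom` draws from a cryptographically strong source, which matters whenever the bytes become tokens, keys, or nonces.

```scala
import java.security.SecureRandom

object SecureRandomExample extends App {
  // Before (predictable if the seed leaks or can be guessed):
  //   val rng = new java.util.Random()
  // After: a CSPRNG suitable for security-sensitive values.
  val rng = new SecureRandom()

  val token = new Array[Byte](16)
  rng.nextBytes(token) // fill with cryptographically strong random bytes
  println(token.map("%02x".format(_)).mkString)
}
```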
## 2023-03-09: push to `master` (40 commits)

`9bf174f9722e34f13bfaede5e59f989bf2a511e9` → `60849b78204e69392976420b9a813bed0790e4e9`. Head commit:

**[SPARK-42719][CORE] `MapOutputTracker#getMapLocation` should respect `spark.shuffle.reduceLocality.enabled`**

### What changes were proposed in this pull request?
`MapOutputTracker#getMapLocation` should respect `spark.shuffle.reduceLocality.enabled`.

### Why are the changes needed?

Discussed at https://github.com/apache/spark/pull/40307.

Conceptually, `getPreferredLocations` in `ShuffledRowRDD` should return `Nil` at the very beginning when `spark.shuffle.reduceLocality.enabled = false`. This logic is pushed into `MapOutputTracker`, though, and while `getPreferredLocationsForShuffle` honors `spark.shuffle.reduceLocality.enabled`, `getMapLocation` does not. So the fix is to make `getMapLocation` honor the parameter.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
New unit test.

Closes #40339 from jerqi/new_feature.

Authored-by: roryqi
Signed-off-by: Mridul Muralidharan

## 2023-03-09: branch `support-hook-for-reliable-shuffle` deleted

Deleted at `c5bfcc994b523148faff20e78bb5f5a0e355382a`.
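The SPARK-42719 fix reduces to an early-exit guard. A hedged sketch of its shape with a simplified signature, not the real `MapOutputTracker` internals:

```scala
import org.apache.spark.SparkConf

// Simplified stand-in for the real method: when reduce-side locality
// is disabled, report no preferred locations instead of map locations.
object MapLocationSketch {
  def getMapLocation(conf: SparkConf)(computeLocations: => Seq[String]): Seq[String] = {
    val reduceLocalityEnabled =
      conf.getBoolean("spark.shuffle.reduceLocality.enabled", defaultValue = true)
    if (!reduceLocalityEnabled) Nil else computeLocations
  }
}
```

This mirrors what `getPreferredLocationsForShuffle` already did: with locality disabled, returning `Nil` lets the scheduler place reducers anywhere.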
Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Fix line length","shortMessageHtmlLink":"Fix line length"}},{"before":"d35c7ed5586af87fc26f8bc7488025c236454531","after":"05027d37904294d63f696b2138a8dd8a827f67cc","ref":"refs/heads/support-hook-for-reliable-shuffle","pushedAt":"2023-03-09T02:14:55.679Z","pushType":"push","commitsCount":1,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Fix import issue","shortMessageHtmlLink":"Fix import issue"}},{"before":"de9e6bf1d578a960d4f297d24852ff70575797fa","after":"d35c7ed5586af87fc26f8bc7488025c236454531","ref":"refs/heads/support-hook-for-reliable-shuffle","pushedAt":"2023-03-09T02:11:29.028Z","pushType":"push","commitsCount":1,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Fix import issue","shortMessageHtmlLink":"Fix import issue"}},{"before":"5da15de111073bd0cbd378e96f9fc8ba7ac0464e","after":"de9e6bf1d578a960d4f297d24852ff70575797fa","ref":"refs/heads/support-hook-for-reliable-shuffle","pushedAt":"2023-03-09T02:10:41.079Z","pushType":"push","commitsCount":1,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Address review comments","shortMessageHtmlLink":"Address review comments"}},{"before":"9d559221439183f2e86c81c48239ea564c883655","after":"5da15de111073bd0cbd378e96f9fc8ba7ac0464e","ref":"refs/heads/support-hook-for-reliable-shuffle","pushedAt":"2023-03-08T17:04:47.199Z","pushType":"push","commitsCount":1,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Fix linter errors","shortMessageHtmlLink":"Fix linter errors"}},{"before":"f451d75f50520d966cce28b749ecf4db5164ff6b","after":"9d559221439183f2e86c81c48239ea564c883655","ref":"refs/heads/support-hook-for-reliable-shuffle","pushedAt":"2023-03-08T09:04:21.060Z","pushType":"push","commitsCount":1,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Remove small bugfix - defer for later","shortMessageHtmlLink":"Remove small bugfix - defer for later"}},{"before":"100aa3b2d0889b0e7e7f239ec856a20e2dd9187c","after":"f451d75f50520d966cce28b749ecf4db5164ff6b","ref":"refs/heads/support-hook-for-reliable-shuffle","pushedAt":"2023-03-08T09:01:37.206Z","pushType":"push","commitsCount":1,"pusher":{"login":"mridulm","name":"Mridul Muralidharan","path":"/mridulm","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/1591700?s=80&v=4"},"commit":{"message":"Add tests","shortMessageHtmlLink":"Add tests"}}],"hasNextPage":false,"hasPreviousPage":false,"activityType":"all","actor":null,"timePeriod":"all","sort":"DESC","perPage":30,"startCursor":"Y3Vyc29yOnYyOpK7MjAyNC0wMi0yNVQwNzo1NDozNS4wMDAwMDBazwAAAAQEZvpn","endCursor":"Y3Vyc29yOnYyOpK7MjAyMy0wMy0wOFQwOTowMTozNy4yMDYzNDlazwAAAAL-3BVu"}},"title":"Activity · mridulm/spark"}