
mapOncePerWorker will no longer ignore failed execution #298

GitHub Actions / Unit Test Results succeeded Oct 23, 2023 in 0s

All 825 tests pass, 99 skipped in 43m 42s

924 tests (−6)     825 passed ✔️ (−1)    43m 42s ⏱️ (+1m 34s)
126 suites (±0)     99 skipped 💤 (±0)
126 files (±0)       0 failed (−5)

Results for commit 698b295, compared against earlier commit 83d071a.
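
For context on the behaviour these results verify: mapOncePerWorker runs a setup function once on each worker, and this change makes a failure in that function surface as an error instead of being silently dropped. The Scala sketch below is only a loose, hypothetical illustration of that failure-propagation idea; the object and method names (OncePerWorkerSketch, runOncePerSlot), the once-per-partition approximation via defaultParallelism, and the wrapping exception are assumptions, not spookystuff's actual API.

```scala
import org.apache.spark.sql.SparkSession

import scala.util.Try

// Hypothetical sketch only (not spookystuff's actual API): run a side-effecting
// setup function roughly once per task slot and propagate any failure to the
// driver instead of silently discarding it.
object OncePerWorkerSketch {

  def runOncePerSlot(spark: SparkSession)(fn: () => Unit): Unit = {
    val sc    = spark.sparkContext
    val slots = sc.defaultParallelism // assumption: one run per default task slot

    // One single-element partition per slot; each partition invokes fn once and
    // records the outcome rather than swallowing it.
    val failures: Array[Throwable] = sc
      .parallelize(1 to slots, slots)
      .mapPartitions { _ =>
        Iterator(Try(fn()).failed.toOption)
      }
      .collect()
      .flatten

    // The behaviour the PR title describes: a failed execution is no longer ignored.
    failures.headOption.foreach { e =>
      throw new RuntimeException(s"${failures.length} worker task(s) failed", e)
    }
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[2]").appName("once-per-worker-sketch").getOrCreate()
    try {
      runOncePerSlot(spark) { () =>
        println(s"initialising on ${java.net.InetAddress.getLocalHost.getHostName}")
      }
    } finally spark.stop()
  }
}
```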

Annotations

Check notice on line 0 in .github


github-actions / Unit Test Results

99 skipped tests found

There are 99 skipped tests, see "Raw output" for the full list of skipped tests.
Raw output
com.tribbloids.spookystuff.CleanableSpike ‑ wait for closing
com.tribbloids.spookystuff.assembly.JarHellDetection ‑ jars conflict
com.tribbloids.spookystuff.doc.TestUnstructured ‑ Unstructured is serializable for td
com.tribbloids.spookystuff.doc.TestUnstructured ‑ Unstructured is serializable for tr
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ Performance test: Java reflection should be faster than ScalaReflection
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve Action.dryrun
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve Array[String].length
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve Doc.uri
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve Fetched.timestamp
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve String.concat(String)
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve Unstructured.code
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve a defined class method
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve a defined class method that has option return type
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve function of String.startsWith(String) using Java
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve type of List[String].head
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve type of Seq[String].head
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ cannot resolve function when arg type is NULL
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ cannot resolve function when base type is NULL
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByJava should return None if parameter Type is incorrect
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByJava should work on function with option parameter
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByScala should work on function with option parameter
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByJava should return None if parameter Type is incorrect
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByJava should work on function with option parameter
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByScala should work on function with option parameter
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ DocCacheAware(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ DocCacheAware(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ Narrow/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ Narrow/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ Wide(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ Wide(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.lifespan.GCCleaningSpike ‑ System.gc() can dispose unreachable object <: class with finalizer
com.tribbloids.spookystuff.lifespan.GCCleaningSpike ‑ System.gc() can dispose unreachable object registered to a cleaner
com.tribbloids.spookystuff.lifespan.GCCleaningSpike ‑ System.gc() can dispose unreachable object registered to a phantom reference cleanup thread
com.tribbloids.spookystuff.lifespan.GCCleaningSpike ‑ System.gc() can dispose unreachable object registered to a weak reference cleanup thread
com.tribbloids.spookystuff.lifespan.GCCleaningSpike ‑ System.gc() can dispose unreachable object with finalizer
com.tribbloids.spookystuff.parsing.ParsersBenchmark ‑ replace N
com.tribbloids.spookystuff.relay.RelaySuite ‑ SerializingParam[Function1] should work
com.tribbloids.spookystuff.spike.SlowRDDSpike ‑ RDD
com.tribbloids.spookystuff.spike.SlowRDDSpike ‑ is repartitioning non-blocking? dataset
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ ... even if the RDD is not Serializable
com.tribbloids.spookystuff.utils.io.HDFSResolverSpike ‑ HDFSResolver can read from FTP server
com.tribbloids.spookystuff.utils.io.HDFSResolverSpike ‑ low level test case
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ Lock can guarantee sequential access ... even for non existing path
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ Lock can guarantee sequential read and write to non-existing file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ move different files to the same target should be sequential
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ touch should be sequential
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ Lock can guarantee sequential access ... even for non existing path
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ Lock can guarantee sequential read and write to non-existing file
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ move different files to the same target should be sequential
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ touch should be sequential
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ each partition of the first operand of cogroup should not move, but elements are shuffled
com.tribbloids.spookystuff.utils.serialization.BeforeAndAfterShippingSuite ‑ can serialize self
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ css selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ sizzle selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ visit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ visit should handle corsera
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ visit, input submit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ css selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ sizzle selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ visit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ visit should handle corsera
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ visit, input submit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ css selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ sizzle selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ visit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ visit should handle corsera
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ visit, input submit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ css selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ sizzle selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ visit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ visit should handle corsera
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ visit, input submit and snapshot
org.apache.spark.ml.dsl.utils.RecursiveEitherAsUnionToJSONSpike ‑ JSON <=> Union of arity 3
org.apache.spark.ml.dsl.utils.RecursiveEitherAsUnionToJSONSpike ‑ JSON <=> case class with Option[Union] of arity 3
org.apache.spark.ml.dsl.utils.RecursiveEitherAsUnionToJSONSpike ‑ JSON <=> case class with Union of arity 3
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ missing member to default constructor value
org.apache.spark.ml.dsl.utils.data.Json4sSpike ‑ encode/decode ListMap
org.apache.spark.ml.dsl.utils.data.Json4sSpike ‑ encode/decode Path
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ TypeTag from Type can be serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ can create ClassTag for Array[T]
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ can get TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ can get another TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ can reflect anon class
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ can reflect lambda
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ scala reflection can be used to get type of Array[String].headOption
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[(String, Int)] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[(String, Int)]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[Double]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[Int]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[String]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Double] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Int] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Seq[(String, Int)]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Seq[String]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[String] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.Multipart.type] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.Multipart] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.User] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[java.sql.Timestamp] ... even if created from raw Type

Check notice on line 0 in .github


github-actions / Unit Test Results

924 tests found (test 1 to 553)

There are 924 tests, see "Raw output" for the list of tests 1 to 553.
Raw output
com.tribbloids.spookystuff.CleanableSpike ‑ wait for closing
com.tribbloids.spookystuff.Python3DriverSuite ‑ CommonUtils.withDeadline can interrupt python execution that blocks indefinitely
com.tribbloids.spookystuff.Python3DriverSuite ‑ call should return None if result variable is undefined
com.tribbloids.spookystuff.Python3DriverSuite ‑ can use the correct python version
com.tribbloids.spookystuff.Python3DriverSuite ‑ clean() won't be blocked indefinitely by ongoing python execution
com.tribbloids.spookystuff.Python3DriverSuite ‑ interpret should throw an exception if interpreter raises a multi-line error
com.tribbloids.spookystuff.Python3DriverSuite ‑ interpret should throw an exception if interpreter raises a syntax error
com.tribbloids.spookystuff.Python3DriverSuite ‑ interpret should throw an exception if interpreter raises an error
com.tribbloids.spookystuff.Python3DriverSuite ‑ interpret should yield 1 row for a single print
com.tribbloids.spookystuff.Python3DriverSuite ‑ sendAndGetResult should work if interpretation triggers an error
com.tribbloids.spookystuff.Python3DriverSuite ‑ sendAndGetResult should work in multiple threads
com.tribbloids.spookystuff.Python3DriverSuite ‑ sendAndGetResult should work in single thread
com.tribbloids.spookystuff.SpookyContextSuite ‑ SpookyContext should be Serializable
com.tribbloids.spookystuff.SpookyContextSuite ‑ SpookyContext.dsl should be Serializable
com.tribbloids.spookystuff.SpookyContextSuite ‑ can create PageRow from String
com.tribbloids.spookystuff.SpookyContextSuite ‑ can create PageRow from map[Int, String]
com.tribbloids.spookystuff.SpookyContextSuite ‑ can create PageRow from map[String, String]
com.tribbloids.spookystuff.SpookyContextSuite ‑ can create PageRow from map[Symbol, String]
com.tribbloids.spookystuff.SpookyContextSuite ‑ default SpookyContext should have default dir configs
com.tribbloids.spookystuff.SpookyContextSuite ‑ derived instances of a SpookyContext should have the same configuration
com.tribbloids.spookystuff.SpookyContextSuite ‑ derived instances of a SpookyContext should have the same configuration after it has been modified
com.tribbloids.spookystuff.SpookyContextSuite ‑ each noInput should have independent metrics if sharedMetrics=false
com.tribbloids.spookystuff.SpookyContextSuite ‑ each noInput should have shared metrics if sharedMetrics=true
com.tribbloids.spookystuff.SpookyContextSuite ‑ when sharedMetrics=false, new SpookyContext created from default SpookyConf should have default dir configs
com.tribbloids.spookystuff.SpookyContextSuite ‑ when sharedMetrics=true, new SpookyContext created from default SpookyConf should have default dir configs
com.tribbloids.spookystuff.SpookyExceptionSuite ‑ DFSReadException .getMessage contains causes
com.tribbloids.spookystuff.actions.ActionSuite ‑ Timed mixin can terminate execution if it takes too long
com.tribbloids.spookystuff.actions.ActionSuite ‑ Wget -> JSON
com.tribbloids.spookystuff.actions.ActionSuite ‑ Wget -> treeText
com.tribbloids.spookystuff.actions.ActionSuite ‑ interpolate should not change name
com.tribbloids.spookystuff.actions.ActionSuite ‑ interpolate should not change timeout
com.tribbloids.spookystuff.actions.TestBlock ‑ Try(Wget) can failsafe on malformed uri
com.tribbloids.spookystuff.actions.TestBlock ‑ loop without export won't need driver
com.tribbloids.spookystuff.actions.TestBlock ‑ try without export won't need driver
com.tribbloids.spookystuff.actions.TestBlock ‑ wayback time of loop should be identical to its last child supporting wayback
com.tribbloids.spookystuff.actions.TestOAuth ‑ output of wget should not include session's backtrace
com.tribbloids.spookystuff.actions.TestOAuth ‑ wget should encode malformed url
com.tribbloids.spookystuff.actions.TestOAuth ‑ wget.interpolate should not overwrite each other
com.tribbloids.spookystuff.actions.TestWget ‑ output of wget should not include session's backtrace
com.tribbloids.spookystuff.actions.TestWget ‑ wget should encode malformed url
com.tribbloids.spookystuff.actions.TestWget ‑ wget.interpolate should not overwrite each other
com.tribbloids.spookystuff.assembly.JarHellDetection ‑ jars conflict
com.tribbloids.spookystuff.caching.TestDFSDocCache ‑ cache and restore
com.tribbloids.spookystuff.caching.TestDFSDocCache ‑ cache visit and restore with different name
com.tribbloids.spookystuff.caching.TestDFSDocCache ‑ cache wget and restore with different name
com.tribbloids.spookystuff.caching.TestInMemoryDocCache ‑ cache and restore
com.tribbloids.spookystuff.caching.TestInMemoryDocCache ‑ cache visit and restore with different name
com.tribbloids.spookystuff.caching.TestInMemoryDocCache ‑ cache wget and restore with different name
com.tribbloids.spookystuff.conf.SpookyConfSuite ‑ DirConf.import can read from SparkConf
com.tribbloids.spookystuff.conf.SpookyConfSuite ‑ DirConf.import is serializable
com.tribbloids.spookystuff.conf.SpookyConfSuite ‑ SpookyConf is serializable
com.tribbloids.spookystuff.conf.SpookyConfSuite ‑ getProperty() can load property from spark property
com.tribbloids.spookystuff.conf.SpookyConfSuite ‑ getProperty() can load property from system property
com.tribbloids.spookystuff.doc.TestPageFromAbsoluteFile ‑ childrenWithSiblings
com.tribbloids.spookystuff.doc.TestPageFromAbsoluteFile ‑ childrenWithSiblings with overlapping elimiation
com.tribbloids.spookystuff.doc.TestPageFromAbsoluteFile ‑ wget csv, save and load
com.tribbloids.spookystuff.doc.TestPageFromAbsoluteFile ‑ wget dir, save and load
com.tribbloids.spookystuff.doc.TestPageFromAbsoluteFile ‑ wget html, save and load
com.tribbloids.spookystuff.doc.TestPageFromAbsoluteFile ‑ wget image, save and load
com.tribbloids.spookystuff.doc.TestPageFromAbsoluteFile ‑ wget json, save and load
com.tribbloids.spookystuff.doc.TestPageFromAbsoluteFile ‑ wget pdf, save and load
com.tribbloids.spookystuff.doc.TestPageFromAbsoluteFile ‑ wget xml, save and load
com.tribbloids.spookystuff.doc.TestPageFromFile ‑ childrenWithSiblings
com.tribbloids.spookystuff.doc.TestPageFromFile ‑ childrenWithSiblings with overlapping elimiation
com.tribbloids.spookystuff.doc.TestPageFromFile ‑ wget csv, save and load
com.tribbloids.spookystuff.doc.TestPageFromFile ‑ wget dir, save and load
com.tribbloids.spookystuff.doc.TestPageFromFile ‑ wget html, save and load
com.tribbloids.spookystuff.doc.TestPageFromFile ‑ wget image, save and load
com.tribbloids.spookystuff.doc.TestPageFromFile ‑ wget json, save and load
com.tribbloids.spookystuff.doc.TestPageFromFile ‑ wget pdf, save and load
com.tribbloids.spookystuff.doc.TestPageFromFile ‑ wget xml, save and load
com.tribbloids.spookystuff.doc.TestPageFromHttp ‑ childrenWithSiblings
com.tribbloids.spookystuff.doc.TestPageFromHttp ‑ childrenWithSiblings with overlapping elimiation
com.tribbloids.spookystuff.doc.TestPageFromHttp ‑ wget csv, save and load
com.tribbloids.spookystuff.doc.TestPageFromHttp ‑ wget html, save and load
com.tribbloids.spookystuff.doc.TestPageFromHttp ‑ wget image, save and load
com.tribbloids.spookystuff.doc.TestPageFromHttp ‑ wget json, save and load
com.tribbloids.spookystuff.doc.TestPageFromHttp ‑ wget pdf, save and load
com.tribbloids.spookystuff.doc.TestPageFromHttp ‑ wget xml, save and load
com.tribbloids.spookystuff.doc.TestUnstructured ‑ Unstructured is serializable for div
com.tribbloids.spookystuff.doc.TestUnstructured ‑ Unstructured is serializable for td
com.tribbloids.spookystuff.doc.TestUnstructured ‑ Unstructured is serializable for tr
com.tribbloids.spookystuff.doc.TestUnstructured ‑ attrs should handles empty attributes properly
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ Flat$ should not encode action parameters that are not in primary constructor
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ Flat$ should not use default Function.toString
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ Flat$ should not yield string containing new line character
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ Hierarchical$ should not encode action parameters that are not in primary constructor
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ Hierarchical$ should not use default Function.toString
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ Hierarchical$ should not yield string containing new line character
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ TimeStampName should not encode action parameters that are not in primary constructor
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ TimeStampName should not use default Function.toString
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ TimeStampName should not yield string containing new line character
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ UUIDName should not encode action parameters that are not in primary constructor
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ UUIDName should not use default Function.toString
com.tribbloids.spookystuff.dsl.FilePathsSuite ‑ UUIDName should not yield string containing new line character
com.tribbloids.spookystuff.dsl.GenPartitionerSuite ‑ DocCacheAware can co-partition 2 RDDs
com.tribbloids.spookystuff.dsl.TestDSL ‑ SpookyContext can be cast to a blank PageRowRDD with empty schema
com.tribbloids.spookystuff.dsl.TestDSL ‑ andFn
com.tribbloids.spookystuff.dsl.TestDSL ‑ andUnlift
com.tribbloids.spookystuff.dsl.TestDSL ‑ defaultAs should not rename an Alias
com.tribbloids.spookystuff.dsl.TestDSL ‑ double quotes in selector by attribute should work
com.tribbloids.spookystuff.dsl.TestDSL ‑ string interpolation
com.tribbloids.spookystuff.dsl.TestDSL ‑ symbol as Expr
com.tribbloids.spookystuff.dsl.TestDSL ‑ uri
com.tribbloids.spookystuff.execution.SchemaContextSuite ‑ Resolver should not scramble sequence of fields
com.tribbloids.spookystuff.execution.TestExplorePlan ‑ ExplorePlan should create a new beaconRDD if its upstream doesn't have one
com.tribbloids.spookystuff.execution.TestExplorePlan ‑ ExplorePlan should inherit old beaconRDD from upstream if exists
com.tribbloids.spookystuff.execution.TestExplorePlan ‑ ExplorePlan should work recursively on directory
com.tribbloids.spookystuff.execution.TestExplorePlan ‑ ExplorePlan will throw an exception if OrdinalField == DepthField
com.tribbloids.spookystuff.execution.TestExplorePlan ‑ ExplorePlan.toString should work
com.tribbloids.spookystuff.execution.TestExplorePlan ‑ When using custom keyBy function, explore plan can avoid fetching traces with identical nodeKey and preserve keyBy in its output
com.tribbloids.spookystuff.execution.TestExplorePlan ‑ explore plan will avoid shuffling the latest batch to minimize repeated fetch
com.tribbloids.spookystuff.execution.TestExtractPlan ‑ ExtractPlan can append to old values using ~+ operator
com.tribbloids.spookystuff.execution.TestExtractPlan ‑ ExtractPlan can assign aliases to unnamed fields
com.tribbloids.spookystuff.execution.TestExtractPlan ‑ ExtractPlan can erase old values that has a different DataType using ~+ operator
com.tribbloids.spookystuff.execution.TestExtractPlan ‑ ExtractPlan can overwrite old values using ! postfix
com.tribbloids.spookystuff.execution.TestExtractPlan ‑ ExtractPlan cannot partially overwrite old values with the same field id but different DataType
com.tribbloids.spookystuff.execution.TestExtractPlan ‑ In ExtractPlan, weak values are cleaned in case of a conflict
com.tribbloids.spookystuff.execution.TestExtractPlan ‑ In ExtractPlan, weak values are not cleaned if being overwritten using ~! operator
com.tribbloids.spookystuff.execution.TestFetchPlan ‑ FetchPlan is lazy and doesn't immediately do the fetch
com.tribbloids.spookystuff.execution.TestFetchPlan ‑ FetchPlan should be serializable
com.tribbloids.spookystuff.execution.TestFetchPlan ‑ FetchPlan should create a new beaconRDD if its upstream doesn't have one
com.tribbloids.spookystuff.execution.TestFetchPlan ‑ FetchPlan should inherit old beaconRDD from upstream if exists
com.tribbloids.spookystuff.execution.TestFetchPlan ‑ FetchPlan.toString should work
com.tribbloids.spookystuff.execution.TestFetchPlan ‑ fetch() + count() will fetch once
com.tribbloids.spookystuff.execution.TestFetchPlan ‑ fetch() + select() + count() will fetch once
com.tribbloids.spookystuff.execution.TestFlattenPlan ‑ FlattenPlan should work on collection
com.tribbloids.spookystuff.execution.TestFlattenPlan ‑ FlattenPlan should work on collection if not manually set alias
com.tribbloids.spookystuff.execution.TestFlattenPlan ‑ FlattenPlan should work on collection if overwriting defaultJoinField
com.tribbloids.spookystuff.execution.TestFlattenPlan ‑ FlattenPlan should work on extracted List
com.tribbloids.spookystuff.execution.TestFlattenPlan ‑ FlattenPlan should work on extracted Seq
com.tribbloids.spookystuff.execution.TestFlattenPlan ‑ FlattenPlan should work on extracted array
com.tribbloids.spookystuff.execution.TestFlattenPlan ‑ FlattenPlan should work on partial collection
com.tribbloids.spookystuff.execution.TestFlattenPlan ‑ flatExtract is equivalent to flatten + extract
com.tribbloids.spookystuff.execution.TestFlattenPlan ‑ flatExtract is equivalent to flatten + extract if not manually set join key
com.tribbloids.spookystuff.extractors.ColSuite ‑ Col(Lit) ==
com.tribbloids.spookystuff.extractors.ColSuite ‑ Col(Lit).toString
com.tribbloids.spookystuff.extractors.ColSuite ‑ Col(Lit).value
com.tribbloids.spookystuff.extractors.ColSuite ‑ Col(Symbol).toString
com.tribbloids.spookystuff.extractors.ExtractorsSuite ‑ Literal -> JSON
com.tribbloids.spookystuff.extractors.ExtractorsSuite ‑ Literal.toString
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Alias(fromFn) and all its resolved functions are serializable
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Alias(fromFn).apply won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Alias(fromFn).applyOrElse won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Alias(fromFn).isDefined won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Alias(fromFn).lift.apply won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Alias(fromOptionFn) and all its resolved functions are serializable
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Alias(fromOptionFn).apply won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Alias(fromOptionFn).applyOrElse won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Alias(fromOptionFn).isDefined won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Alias(fromOptionFn).lift.apply won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Some(optionFn) is serializable
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ Some(partialFn) is serializable
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ fromFn and all its resolved functions are serializable
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ fromFn.apply won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ fromFn.applyOrElse won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ fromFn.isDefined won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ fromFn.lift.apply won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ fromOptionFn and all its resolved functions are serializable
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ fromOptionFn.apply won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ fromOptionFn.applyOrElse won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ fromOptionFn.isDefined won't execute twice
com.tribbloids.spookystuff.extractors.GenExtractorSuite ‑ fromOptionFn.lift.apply won't execute twice
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ Performance test: Java reflection should be faster than ScalaReflection
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve Action.dryrun
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve Array[String].length
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve Doc.uri
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve Fetched.timestamp
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve String.concat(String)
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve Unstructured.code
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve a defined class method
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve a defined class method that has option return type
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve function of String.startsWith(String) using Java
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve type of List[String].head
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ can resolve type of Seq[String].head
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ cannot resolve function when arg type is NULL
com.tribbloids.spookystuff.extractors.ScalaDynamicExtractorSuite ‑ cannot resolve function when base type is NULL
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByJava should return None if parameter Type is incorrect
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByJava should work on function with option output
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByJava should work on function with option parameter
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByJava should work on operator
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByJava should work on overloaded function
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByScala should throw error if parameter Type is incorrect
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByScala should work on function with option output
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByScala should work on function with option parameter
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByScala should work on operator
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodByScala should work on overloaded function
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodsByName should work on case constructor parameter
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodsByName should work on function with default parameters
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodsByName should work on lazy val property
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodsByName should work on operator
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike ‑ getMethodsByName should work on overloaded function
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByJava should return None if parameter Type is incorrect
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByJava should work on function with option output
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByJava should work on function with option parameter
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByJava should work on operator
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByJava should work on overloaded function
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByScala should throw error if parameter Type is incorrect
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByScala should work on function with option output
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByScala should work on function with option parameter
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByScala should work on operator
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodByScala should work on overloaded function
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodsByName should work on case constructor parameter
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodsByName should work on function with default parameters
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodsByName should work on lazy val property
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodsByName should work on operator
com.tribbloids.spookystuff.extractors.ScalaReflectionSpike_Generic ‑ getMethodsByName should work on overloaded function
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ ... implicitly
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ Operand from EdgeData
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ Operand from NodeData
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ acyclic detached edge >>> detached edge
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ acyclic node >>> edge >>> node
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ acyclic node >>> node
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ bidirectional node >>> node <<< node
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ cyclic (2 edges >- node ) >>> itself
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ cyclic edge-node >>> itself
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ cyclic node >>> edge >>> same node
com.tribbloids.spookystuff.graph.FlowLayoutSuite ‑ cyclic node >>> itself
com.tribbloids.spookystuff.integration.explore.ExploreClickNextPageIT ‑ DocCacheAware(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreClickNextPageIT ‑ DocCacheAware(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreClickNextPageIT ‑ Narrow/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreClickNextPageIT ‑ Narrow/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreClickNextPageIT ‑ Wide(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreClickNextPageIT ‑ Wide(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreDirectoriesIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreDirectoriesIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreDirectoriesIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreDirectoriesWithSchemeIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreDirectoriesWithSchemeIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreDirectoriesWithSchemeIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreNextPageIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreNextPageIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExploreNextPageIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExplorePagesIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExplorePagesIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.ExplorePagesIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.JoinAndExplorePagesIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.JoinAndExplorePagesIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.explore.JoinAndExplorePagesIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchClickNextPageIT ‑ DocCacheAware(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchClickNextPageIT ‑ DocCacheAware(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchClickNextPageIT ‑ Narrow/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchClickNextPageIT ‑ Narrow/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchClickNextPageIT ‑ Wide(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchClickNextPageIT ‑ Wide(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ DocCacheAware(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ DocCacheAware(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ Narrow/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ Narrow/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ Wide(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchInteractionsIT ‑ Wide(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchTryOAuthWgetIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchTryOAuthWgetIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchTryOAuthWgetIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchTryWgetIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchTryWgetIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchTryWgetIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchVisitIT ‑ DocCacheAware(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchVisitIT ‑ DocCacheAware(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchVisitIT ‑ Narrow/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchVisitIT ‑ Narrow/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchVisitIT ‑ Wide(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchVisitIT ‑ Wide(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchWgetAndSaveIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchWgetAndSaveIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.fetch.FetchWgetAndSaveIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.InnerVisitJoinIT ‑ DocCacheAware(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.InnerVisitJoinIT ‑ DocCacheAware(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.InnerVisitJoinIT ‑ Narrow/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.InnerVisitJoinIT ‑ Narrow/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.InnerVisitJoinIT ‑ Wide(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.InnerVisitJoinIT ‑ Wide(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.InnerWgetJoinIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.InnerWgetJoinIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.InnerWgetJoinIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.LeftVisitJoinIT ‑ DocCacheAware(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.LeftVisitJoinIT ‑ DocCacheAware(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.LeftVisitJoinIT ‑ Narrow/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.LeftVisitJoinIT ‑ Narrow/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.LeftVisitJoinIT ‑ Wide(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.LeftVisitJoinIT ‑ Wide(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.LeftWgetJoinIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.LeftWgetJoinIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.join.LeftWgetJoinIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.ChainedFlatSelectIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.ChainedFlatSelectIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.ChainedFlatSelectIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.FlatSelectIT ‑ DocCacheAware(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.FlatSelectIT ‑ Narrow/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.FlatSelectIT ‑ Wide(<function1>)/null//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.SelectIT ‑ DocCacheAware(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.SelectIT ‑ DocCacheAware(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.SelectIT ‑ Narrow/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.SelectIT ‑ Narrow/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.SelectIT ‑ Wide(<function1>)/PhantomJS//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.integration.select.SelectIT ‑ Wide(<function1>)/TaskLocal(PhantomJS)//home/runner/work/spookystuff/spookystuff/parent/integration/temp/spooky-integration
com.tribbloids.spookystuff.lifespan.CleanableSuite ‑ Lifespan.JVM is serializable
com.tribbloids.spookystuff.lifespan.CleanableSuite ‑ Lifespan.JVM.batchID is serializable
com.tribbloids.spookystuff.lifespan.CleanableSuite ‑ Lifespan.Task is serializable
com.tribbloids.spookystuff.lifespan.CleanableSuite ‑ Lifespan._id should be updated after being shipped to driver
com.tribbloids.spookystuff.lifespan.CleanableSuite ‑ Lifespan.batchIDs should be updated after being shipped to a different executor
com.tribbloids.spookystuff.lifespan.CleanableSuite ‑ Lifespan.batchIDs should be updated after being shipped to a new thread created by a different executor
com.tribbloids.spookystuff.lifespan.CleanableSuite ‑ can get all created Cleanables
com.tribbloids.spookystuff.lifespan.CleanableSuite ‑ can get all created Cleanables even their hashcodes may overlap
com.tribbloids.spookystuff.lifespan.GCCleaningSpike ‑ System.gc() can dispose unreachable object <: class with finalizer
com.tribbloids.spookystuff.lifespan.GCCleaningSpike ‑ System.gc() can dispose unreachable object registered to a cleaner
com.tribbloids.spookystuff.lifespan.GCCleaningSpike ‑ System.gc() can dispose unreachable object registered to a phantom reference cleanup thread
com.tribbloids.spookystuff.lifespan.GCCleaningSpike ‑ System.gc() can dispose unreachable object registered to a weak reference cleanup thread
com.tribbloids.spookystuff.lifespan.GCCleaningSpike ‑ System.gc() can dispose unreachable object with finalizer
com.tribbloids.spookystuff.lifespan.MinimalCleanerSpike ‑ ..
com.tribbloids.spookystuff.metrics.AccSuite ‑ FromV0
com.tribbloids.spookystuff.metrics.AccSuite ‑ Simple
com.tribbloids.spookystuff.metrics.MetricsSuite ‑ can be converted to JSON
com.tribbloids.spookystuff.metrics.MetricsSuite ‑ tree can be converted to JSON
com.tribbloids.spookystuff.parsing.FSMParserDSLSuite ‑ can form linear graph
com.tribbloids.spookystuff.parsing.FSMParserDSLSuite ‑ can form loop
com.tribbloids.spookystuff.parsing.FSMParserDSLSuite ‑ can form non-linear graph :~>
com.tribbloids.spookystuff.parsing.FSMParserDSLSuite ‑ can form non-linear graph <~:
com.tribbloids.spookystuff.parsing.FSMParserDSLSuite ‑ can form self-loop
com.tribbloids.spookystuff.parsing.FSMParserDSLSuite ‑ self-loop can union with others
com.tribbloids.spookystuff.parsing.ParsersBenchmark ‑ replace N
com.tribbloids.spookystuff.parsing.ParsingRunSuite ‑ backtracking unclosed bracket
com.tribbloids.spookystuff.parsing.ParsingRunSuite ‑ conditional can parse paired brackets
com.tribbloids.spookystuff.parsing.ParsingRunSuite ‑ linear for 1 rule
com.tribbloids.spookystuff.parsing.ParsingRunSuite ‑ linear for 4 rules in 2 stages + EOS
com.tribbloids.spookystuff.parsing.ParsingRunSuite ‑ linear for 4 rules with diamond path
com.tribbloids.spookystuff.parsing.ParsingRunSuite ‑ loop escape by \
com.tribbloids.spookystuff.parsing.ParsingRunSuite ‑ loop multiple pair brackets
com.tribbloids.spookystuff.python.ref.PyRefSuite ‑ CaseExample can initialize Python instance after constructor parameter has been changed
com.tribbloids.spookystuff.python.ref.PyRefSuite ‑ CaseExample can initialize Python instance with missing constructor parameter
com.tribbloids.spookystuff.python.ref.PyRefSuite ‑ JSONInstanceRef can initialize Python instance after constructor parameter has been changed
com.tribbloids.spookystuff.python.ref.PyRefSuite ‑ JSONInstanceRef can initialize Python instance with missing constructor parameter
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ .map should not run preceding transformation multiple times
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ .rdd should not run preceding transformation multiple times
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ explore plan can be persisted
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ extract plan can be persisted
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ fetch plan can be persisted
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ flatten plan can be persisted
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ savePage ... on persisted RDD
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ savePage eagerly
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ savePage lazily
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ toDF can handle composite types
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ toDF can handle simple types
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ toDF can yield a DataFrame excluding Fields with .isSelected = false
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ toDF(false) should not run preceding transformation multiple times
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ toDF(true) should not run preceding transformation multiple times
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ toJSON(false) should not run preceding transformation multiple times
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ toJSON(true) should not run preceding transformation multiple times
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ toMapRDD(false) should not run preceding transformation multiple times
com.tribbloids.spookystuff.rdd.FetchedDatasetSuite ‑ toMapRDD(true) should not run preceding transformation multiple times
com.tribbloids.spookystuff.relay.RelayRegistrySuite ‑ lookup can find Relay as companion object
com.tribbloids.spookystuff.relay.RelayRegistrySuite ‑ lookup will throw an exception if companion object is not a Relay
com.tribbloids.spookystuff.relay.RelaySuite ‑ Multipart case class JSON read should be broken
com.tribbloids.spookystuff.relay.RelaySuite ‑ Paranamer constructor lookup
com.tribbloids.spookystuff.relay.RelaySuite ‑ SerializingParam[Function1] should work
com.tribbloids.spookystuff.relay.RelaySuite ‑ can convert even less accurate timestamp
com.tribbloids.spookystuff.relay.RelaySuite ‑ can read generated timestamp
com.tribbloids.spookystuff.relay.RelaySuite ‑ can read less accurate timestamp
com.tribbloids.spookystuff.relay.RelaySuite ‑ can read lossless timestamp
com.tribbloids.spookystuff.relay.RelaySuite ‑ reading a wrapped array with misplaced value & default value should fail early
com.tribbloids.spookystuff.relay.RelaySuite ‑ reading an array with misplaced value & default value should fail early
com.tribbloids.spookystuff.relay.RelaySuite ‑ reading an array with missing value & default value should fail early
com.tribbloids.spookystuff.relay.RelaySuite ‑ reading an object from a converted string should work
com.tribbloids.spookystuff.relay.RelaySuite ‑ reading an object with default value should work
com.tribbloids.spookystuff.relay.RelaySuite ‑ reading an object with missing value & default value should fail early
com.tribbloids.spookystuff.relay.RelaySuite ‑ reading an object with provided value should work
com.tribbloids.spookystuff.relay.TreeIRSuite ‑ from/to JSON round-trip
com.tribbloids.spookystuff.relay.io.FormattedTextSuite ‑ FormattedText can print nested case classes
com.tribbloids.spookystuff.relay.io.FormattedTextSuite ‑ FormattedText can print nested map
com.tribbloids.spookystuff.relay.io.FormattedTextSuite ‑ FormattedText can print nested seq
com.tribbloids.spookystuff.relay.io.FormattedTextSuite ‑ FormattedText treeText by AutomaticRelay on seq
com.tribbloids.spookystuff.relay.io.FormattedTextSuite ‑ FormattedText treeText by AutomaticRelay on seq and map
com.tribbloids.spookystuff.relay.io.FormattedTextSuite ‑ FormattedText treeText by AutomaticRelay on value
com.tribbloids.spookystuff.relay.io.FormattedTextSuite ‑ FormattedText treeText by Relay
com.tribbloids.spookystuff.row.DataRowSuite ‑ formatEmptyString
com.tribbloids.spookystuff.row.DataRowSuite ‑ formatNullString
com.tribbloids.spookystuff.row.DataRowSuite ‑ getInt can extract java.lang.Integer type
com.tribbloids.spookystuff.row.DataRowSuite ‑ getInt can extract scala Int type
com.tribbloids.spookystuff.row.DataRowSuite ‑ getIntArray can extract from Array
com.tribbloids.spookystuff.row.DataRowSuite ‑ getIntArray can extract from Array that has different types
com.tribbloids.spookystuff.row.DataRowSuite ‑ getIntArray can extract from Iterator
com.tribbloids.spookystuff.row.DataRowSuite ‑ getIntArray can extract from Set
com.tribbloids.spookystuff.row.DataRowSuite ‑ getIntIterable can extract from Array
com.tribbloids.spookystuff.row.DataRowSuite ‑ getIntIterable can extract from Array that has different types
com.tribbloids.spookystuff.row.DataRowSuite ‑ getIntIterable can extract from Iterator
com.tribbloids.spookystuff.row.DataRowSuite ‑ getIntIterable can extract from Set
com.tribbloids.spookystuff.row.DataRowSuite ‑ getTyped can extract java.lang.Integer type
com.tribbloids.spookystuff.row.DataRowSuite ‑ getTyped can extract scala Int type
com.tribbloids.spookystuff.row.DataRowSuite ‑ getTyped should return None if type is incompatible
com.tribbloids.spookystuff.row.DataRowSuite ‑ getTypedArray can extract from Array
com.tribbloids.spookystuff.row.DataRowSuite ‑ getTypedArray can extract from Array that has different types
com.tribbloids.spookystuff.row.DataRowSuite ‑ getTypedArray can extract from Iterator
com.tribbloids.spookystuff.row.DataRowSuite ‑ getTypedArray can extract from Set
com.tribbloids.spookystuff.row.DataRowSuite ‑ interpolate
com.tribbloids.spookystuff.row.DataRowSuite ‑ interpolate returns None when key not found
com.tribbloids.spookystuff.row.FetchedRowViewSuite ‑ get page
com.tribbloids.spookystuff.row.FetchedRowViewSuite ‑ get unstructured
com.tribbloids.spookystuff.row.SquashedFetchedRowSuite ‑ Array[Page]().grouping yields at least 1 group
com.tribbloids.spookystuff.row.SquashedFetchedRowSuite ‑ ['a 'b 'a 'b].grouping yields ['a 'b] ['a 'b]
com.tribbloids.spookystuff.spike.SlowRDDSpike ‑ RDD
com.tribbloids.spookystuff.spike.SlowRDDSpike ‑ is repartitioning non-blocking? dataset
com.tribbloids.spookystuff.testutils.SpookyUtilsSuite ‑ RDDs.batchReduce yields the same results as RDDs.map(_.reduce)
com.tribbloids.spookystuff.testutils.SpookyUtilsSuite ‑ RDDs.shufflePartitions can move data into random partitions
com.tribbloids.spookystuff.testutils.SpookyUtilsSuite ‑ asArray[Int]
com.tribbloids.spookystuff.testutils.SpookyUtilsSuite ‑ asIterable[Int]
com.tribbloids.spookystuff.testutils.SpookyUtilsSuite ‑ canonizeUrn should clean ?:$&#
com.tribbloids.spookystuff.testutils.SpookyUtilsSuite ‑ withTimeout can execute heartbeat
com.tribbloids.spookystuff.testutils.SpookyUtilsSuite ‑ withTimeout can write heartbeat info into log by default
com.tribbloids.spookystuff.testutils.SpookyUtilsSuite ‑ withTimeout won't be affected by scala concurrency global ForkJoin thread pool
com.tribbloids.spookystuff.utils.BacktrackingIteratorSuite ‑ can backtrack for arbitrary times
com.tribbloids.spookystuff.utils.BacktrackingIteratorSuite ‑ can backtrack once
com.tribbloids.spookystuff.utils.CachingSuite ‑ Weak ConcurrentCache should remove value on garbage collection if the value is de-referenced
com.tribbloids.spookystuff.utils.CachingSuite ‑ Weak ConcurrentCache should remove value on garbage collection if the value is not in scope
com.tribbloids.spookystuff.utils.CachingSuite ‑ spike exit from a subroutine allows all referenced objects to be GC'ed
com.tribbloids.spookystuff.utils.CachingSuite ‑ spike termination of thread allows all referenced objects to be GC'ed
com.tribbloids.spookystuff.utils.InterpolationSuite ‑ can avoid continuous escaped chars
com.tribbloids.spookystuff.utils.InterpolationSuite ‑ can avoid escaped char
com.tribbloids.spookystuff.utils.InterpolationSuite ‑ can identify delimiter after continuous escaped chars
com.tribbloids.spookystuff.utils.InterpolationSuite ‑ can interpolate delimited field
com.tribbloids.spookystuff.utils.InterpolationSuite ‑ interpolate can use common character as delimiter
com.tribbloids.spookystuff.utils.InterpolationSuite ‑ interpolate can use special regex character as delimiter
com.tribbloids.spookystuff.utils.InterpolationSuite ‑ interpolate should allow delimiter to be escaped
com.tribbloids.spookystuff.utils.InterpolationSuite ‑ interpolate should ignore string that contains delimiter without bracket
com.tribbloids.spookystuff.utils.PreemptiveLocalOpsSuite ‑ can be much faster than toLocalIterator
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Benchmark: can be much faster than
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Checkpointed: Disk Memory Deserialized 1x Replicated RDD with 1 partition
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Checkpointed: Disk Memory Deserialized 1x Replicated RDD with many partitions
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Checkpointed: Disk Memory Deserialized 1x Replicated RDD with skewed partitions most of which are empty
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Checkpointed: Memory Deserialized 1x Replicated RDD with 1 partition
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Checkpointed: Memory Deserialized 1x Replicated RDD with many partitions
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Checkpointed: Memory Deserialized 1x Replicated RDD with skewed partitions most of which are empty
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Checkpointed: Memory Serialized 2x Replicated RDD with 1 partition
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Checkpointed: Memory Serialized 2x Replicated RDD with many partitions
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Checkpointed: Memory Serialized 2x Replicated RDD with skewed partitions most of which are empty
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted: Disk Memory Deserialized 1x Replicated RDD with 1 partition
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted: Disk Memory Deserialized 1x Replicated RDD with many partitions
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted: Disk Memory Deserialized 1x Replicated RDD with skewed partitions most of which are empty
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted: Memory Deserialized 1x Replicated RDD with 1 partition
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted: Memory Deserialized 1x Replicated RDD with many partitions
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted: Memory Deserialized 1x Replicated RDD with skewed partitions most of which are empty
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted: Memory Serialized 2x Replicated RDD with 1 partition
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted: Memory Serialized 2x Replicated RDD with many partitions
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted: Memory Serialized 2x Replicated RDD with skewed partitions most of which are empty
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted_RDDReified: Disk Memory Deserialized 1x Replicated RDD with 1 partition
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted_RDDReified: Disk Memory Deserialized 1x Replicated RDD with many partitions
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted_RDDReified: Disk Memory Deserialized 1x Replicated RDD with skewed partitions most of which are empty
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted_RDDReified: Memory Deserialized 1x Replicated RDD with 1 partition
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted_RDDReified: Memory Deserialized 1x Replicated RDD with many partitions
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted_RDDReified: Memory Deserialized 1x Replicated RDD with skewed partitions most of which are empty
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted_RDDReified: Memory Serialized 2x Replicated RDD with 1 partition
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted_RDDReified: Memory Serialized 2x Replicated RDD with many partitions
com.tribbloids.spookystuff.utils.RDDDisperseSuite ‑ Persisted_RDDReified: Memory Serialized 2x Replicated RDD with skewed partitions most of which are empty
com.tribbloids.spookystuff.utils.RangeHashBenchmark ‑ RangeArg hash should be fast
com.tribbloids.spookystuff.utils.SCFunctionsSuite ‑ withJob can override existing groupID
com.tribbloids.spookystuff.utils.SCFunctionsSuite ‑ withJob can stack description
com.tribbloids.spookystuff.utils.SCFunctionsSuite ‑ withJob will not override existing groupID if not specified
com.tribbloids.spookystuff.utils.ScalaUDTSuite ‑ Action has a datatype
com.tribbloids.spookystuff.utils.ScalaUDTSuite ‑ Array[Action] has a datatype
com.tribbloids.spookystuff.utils.ScalaUDTSuite ‑ Array[Int] has a datatype
com.tribbloids.spookystuff.utils.ScalaUDTSuite ‑ Doc has a datatype
com.tribbloids.spookystuff.utils.ScalaUDTSuite ‑ DocOption has a datatype
com.tribbloids.spookystuff.utils.ScalaUDTSuite ‑ Int has a datatype
com.tribbloids.spookystuff.utils.ScalaUDTSuite ‑ Unstructured has a datatype
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ ... even if the RDD is not Serializable
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ ... where execution should fail
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ :/ can handle null component
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ Array.filterByType should work on primitive types
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ Seq/Set.filterByType should work on primitive types
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ \\ can handle null component
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ injectPassthroughPartitioner should not move partitions
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ mapOncePerCore
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ mapOncePerWorker
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ multiPassFlatMap should yield same result as flatMap
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ result of allTaskLocationStrs can be used as partition's preferred location
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ runEverywhere
com.tribbloids.spookystuff.utils.SpookyViewsSuite ‑ runEverywhere (alsoOnDriver)
com.tribbloids.spookystuff.utils.classpath.ClasspathResolverSpec ‑ copyResourceToDirectory can extract a dependency's package in a jar
com.tribbloids.spookystuff.utils.classpath.ClasspathResolverSpec ‑ copyResourceToDirectory can extract a package in file system
com.tribbloids.spookystuff.utils.io.HDFSResolverSpike ‑ HDFSResolver can read from FTP server
com.tribbloids.spookystuff.utils.io.HDFSResolverSpike ‑ low level test case
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ ... on executors
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ .toAbsolute is idempotent
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ Lock can guarantee sequential access ... even for non existing path
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ Lock can guarantee sequential access to empty directory
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ Lock can guarantee sequential access to existing file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ Lock can guarantee sequential access to non empty directory
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ Lock can guarantee sequential read and write to existing file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ Lock can guarantee sequential read and write to non-existing file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ can convert absolute path of non-existing file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ can convert path with schema of non-existing file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ can convert path with schema// of non-existing file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ can convert relative path of non-existing file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ can override login UGI
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ input all accessors can be mutated after creation
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ input can get metadata concurrently
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ move 1 file to different targets should be sequential
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ move 1 file to the same target should be sequential
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ move different files to the same target should be sequential
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ output can automatically create missing directory
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ output cannot overwrite an existing file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ output cannot grant multiple OutputStreams for 1 file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ output copyTo a new file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ output copyTo overwrite an existing file
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ resolver is serializable
com.tribbloids.spookystuff.utils.io.HDFSResolverSuite ‑ touch should be sequential
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ .toAbsolute is idempotent
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ Lock can guarantee sequential access ... even for non existing path
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ Lock can guarantee sequential access to empty directory
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ Lock can guarantee sequential access to existing file
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ Lock can guarantee sequential access to non empty directory
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ Lock can guarantee sequential read and write to existing file
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ Lock can guarantee sequential read and write to non-existing file
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ can convert absolute path of non-existing file
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ can convert relative path of non-existing file
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ input all accessors can be mutated after creation
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ input can get metadata concurrently
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ move 1 file to different targets should be sequential
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ move 1 file to the same target should be sequential
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ move different files to the same target should be sequential
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ output can automatically create missing directory
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ output cannot overwrite an existing file
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ output cannot grant multiple OutputStreams for 1 file
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ output copyTo a new file
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ output copyTo overwrite an existing file
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ resolver is serializable
com.tribbloids.spookystuff.utils.io.LocalResolverSuite ‑ touch should be sequential
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ ... even if keys overlap partially BroadcastLocalityImpl
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ ... even if keys overlap partially BroadcastLocalityImpl.cogroupBase() is always left-outer
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ ... even if keys overlap partially IndexingLocalityImpl
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ ... even if keys overlap partially IndexingLocalityImpl.cogroupBase() is always left-outer
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ ... even if keys overlap partially SortingLocalityImpl
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ ... even if keys overlap partially SortingLocalityImpl.cogroupBase() is always left-outer
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ ... even if the first operand is not serializable BroadcastLocalityImpl
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ ... even if the first operand is not serializable IndexingLocalityImpl
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ ... even if the first operand is not serializable SortingLocalityImpl
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ Spike: when 2 RDDs are cogrouped the first operand containing unserializable objects will not trigger an exception if it has a partitioner
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ Spike: when 2 RDDs are cogrouped the first operand will NOT move if it has a partitioner
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ Spike: when 2 RDDs are cogrouped the second operand containing unserializable objects will not trigger an exception if it has a partitioner
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ Spike: when 2 RDDs are cogrouped the second operand will NOT move if it has a partitioner
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ cogroupBase() can preserve both locality and in-partition orders BroadcastLocalityImpl
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ cogroupBase() can preserve both locality and in-partition orders IndexingLocalityImpl
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ cogroupBase() can preserve both locality and in-partition orders SortingLocalityImpl
com.tribbloids.spookystuff.utils.locality.LocalityImplSuite ‑ each partition of the first operand of cogroup should not move, but elements are shuffled
com.tribbloids.spookystuff.utils.serialization.AssertSerializableSuite ‑ IllegalArgumentException should be WeaklySerializable
com.tribbloids.spookystuff.utils.serialization.BeforeAndAfterShippingSuite ‑ can serialize container
com.tribbloids.spookystuff.utils.serialization.BeforeAndAfterShippingSuite ‑ can serialize self
com.tribbloids.spookystuff.utils.serialization.NOTSerializableSuite ‑ base class is serializable
com.tribbloids.spookystuff.utils.serialization.NOTSerializableSuite ‑ mixin will trigger a runtime error in closure cleaning
com.tribbloids.spookystuff.utils.serialization.NOTSerializableSuite ‑ when using JavaSerializer mixin will trigger a runtime error
com.tribbloids.spookystuff.utils.serialization.NOTSerializableSuite ‑ when using JavaSerializer subclass of a class that inherits mixin will trigger a runtime error

Check notice on line 0 in .github

See this annotation in the file changed.

@github-actions github-actions / Unit Test Results

924 tests found (test 554 to 924)

There are 924 tests, see "Raw output" for the list of tests 554 to 924.
Raw output
com.tribbloids.spookystuff.utils.serialization.NOTSerializableSuite ‑ when using JavaSerializer subclass of a trait that inherits mixin will trigger a runtime error
com.tribbloids.spookystuff.utils.serialization.NOTSerializableSuite ‑ when using KryoSerializer mixin will trigger a runtime error
com.tribbloids.spookystuff.utils.serialization.NOTSerializableSuite ‑ when using KryoSerializer subclass of a class that inherits mixin will trigger a runtime error
com.tribbloids.spookystuff.utils.serialization.NOTSerializableSuite ‑ when using KryoSerializer subclass of a trait that inherits mixin will trigger a runtime error
com.tribbloids.spookystuff.web.actions.TestBlock ‑ Try(Wget) can failsafe on malformed uri
com.tribbloids.spookystuff.web.actions.TestBlock ‑ loop without export won't need driver
com.tribbloids.spookystuff.web.actions.TestBlock ‑ try without export won't need driver
com.tribbloids.spookystuff.web.actions.TestBlock ‑ wayback time of loop should be identical to its last child supporting wayback
com.tribbloids.spookystuff.web.actions.TestPageFromBrowser ‑ empty page
com.tribbloids.spookystuff.web.actions.TestPageFromBrowser ‑ visit, save and load
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ TraceView.TreeNode.toString should have indentations of TreeNode
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ TraceView.autoSnapshot should append Snapshot to non-empty Trace that doesn't end with Export OR Block
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ TraceView.autoSnapshot should append Snapshot to non-empty Trace that has no output
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ TraceView.autoSnapshot should not modify empty Trace
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ css selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ dryrun should discard preceding actions when calculating Driverless action's backtrace
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ inject output names should change output doc names
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ sizzle selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ visit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ visit should handle corsera
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit ‑ visit, input submit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ TraceView.TreeNode.toString should have indentations of TreeNode
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ TraceView.autoSnapshot should append Snapshot to non-empty Trace that doesn't end with Export OR Block
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ TraceView.autoSnapshot should append Snapshot to non-empty Trace that has no output
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ TraceView.autoSnapshot should not modify empty Trace
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ css selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ dryrun should discard preceding actions when calculating Driverless action's backtrace
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ inject output names should change output doc names
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ sizzle selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ visit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ visit should handle corsera
com.tribbloids.spookystuff.web.actions.TestTrace_HtmlUnit_TaskLocal ‑ visit, input submit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ TraceView.TreeNode.toString should have indentations of TreeNode
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ TraceView.autoSnapshot should append Snapshot to non-empty Trace that doesn't end with Export OR Block
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ TraceView.autoSnapshot should append Snapshot to non-empty Trace that has no output
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ TraceView.autoSnapshot should not modify empty Trace
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ css selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ dryrun should discard preceding actions when calculating Driverless action's backtrace
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ inject output names should change output doc names
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ sizzle selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ visit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ visit should handle corsera
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS ‑ visit, input submit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ TraceView.TreeNode.toString should have indentations of TreeNode
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ TraceView.autoSnapshot should append Snapshot to non-empty Trace that doesn't end with Export OR Block
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ TraceView.autoSnapshot should append Snapshot to non-empty Trace that has no output
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ TraceView.autoSnapshot should not modify empty Trace
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ css selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ dryrun should discard preceding actions when calculating Driverless action's backtrace
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ inject output names should change output doc names
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ sizzle selector should work
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ visit and snapshot
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ visit should handle corsera
com.tribbloids.spookystuff.web.actions.TestTrace_PhantomJS_TaskLocal ‑ visit, input submit and snapshot
com.tribbloids.spookystuff.web.actions.TestWayback ‑ Screenshot.waybackTo should work on cache
com.tribbloids.spookystuff.web.actions.TestWayback ‑ Snapshot.waybackTo should work on cache
com.tribbloids.spookystuff.web.actions.TestWayback ‑ Wget.waybackTo should work on cache
com.tribbloids.spookystuff.web.actions.WebActionSuite ‑ Click -> JSON
com.tribbloids.spookystuff.web.actions.WebActionSuite ‑ Click -> treeText
com.tribbloids.spookystuff.web.actions.WebActionSuite ‑ Click has a UDT
com.tribbloids.spookystuff.web.actions.WebActionSuite ‑ ClusterRetryImpl has a UDT
com.tribbloids.spookystuff.web.actions.WebActionSuite ‑ Loop -> JSON
com.tribbloids.spookystuff.web.actions.WebActionSuite ‑ Loop -> treeText
com.tribbloids.spookystuff.web.actions.WebActionSuite ‑ Wget has a UDT
com.tribbloids.spookystuff.web.actions.WebActionSuite ‑ a session with webDriver initialized will trigger errorDump, which should not be blocked by DocFilter
com.tribbloids.spookystuff.web.actions.WebActionSuite ‑ a session without webDriver initialized won't trigger errorDump
com.tribbloids.spookystuff.web.actions.WebActionSuite ‑ errorDump at the end of a series of actions should contain all backtraces
com.tribbloids.spookystuff.web.conf.DriverFactorySpec ‑ HtmlUnit can factoryReset
com.tribbloids.spookystuff.web.conf.DriverFactorySpec ‑ If the old driver is released, the second taskLocal DriverFactory.get() should yield the same driver
com.tribbloids.spookystuff.web.conf.DriverFactorySpec ‑ PhantomJS can factoryReset
com.tribbloids.spookystuff.web.conf.DriverFactorySpec ‑ PythonDriverFactory can factoryReset
com.tribbloids.spookystuff.web.conf.WebDriverSuite ‑ PhantomJS DriverFactory can degrade gracefully if remote URI is unreachable
org.apache.spark.ml.dsl.AppendSuite ‑ :-> Source is cast to union
org.apache.spark.ml.dsl.AppendSuite ‑ :-> Stage is cast to rebase
org.apache.spark.ml.dsl.AppendSuite ‑ <-: Source is cast to union
org.apache.spark.ml.dsl.AppendSuite ‑ <-: Stage is cast to rebase
org.apache.spark.ml.dsl.AppendSuite ‑ A :-> B :-> Source is associative
org.apache.spark.ml.dsl.AppendSuite ‑ A :-> B :-> detached Stage is associative
org.apache.spark.ml.dsl.AppendSuite ‑ A <-: B <-: Source is associative
org.apache.spark.ml.dsl.AppendSuite ‑ A <-: B <-: detached Stage is associative
org.apache.spark.ml.dsl.AppendSuite ‑ can automatically generate names
org.apache.spark.ml.dsl.AppendSuite ‑ pincer topology can be defined by A :-> B <-: A
org.apache.spark.ml.dsl.AppendSuite_PruneDownPath ‑ :-> Source is cast to union
org.apache.spark.ml.dsl.AppendSuite_PruneDownPath ‑ :-> Stage is cast to rebase
org.apache.spark.ml.dsl.AppendSuite_PruneDownPath ‑ <-: Source is cast to union
org.apache.spark.ml.dsl.AppendSuite_PruneDownPath ‑ <-: Stage is cast to rebase
org.apache.spark.ml.dsl.AppendSuite_PruneDownPath ‑ A :-> B :-> Source is associative
org.apache.spark.ml.dsl.AppendSuite_PruneDownPath ‑ A :-> B :-> detached Stage is associative
org.apache.spark.ml.dsl.AppendSuite_PruneDownPath ‑ A <-: B <-: Source is associative
org.apache.spark.ml.dsl.AppendSuite_PruneDownPath ‑ A <-: B <-: detached Stage is associative
org.apache.spark.ml.dsl.AppendSuite_PruneDownPath ‑ can automatically generate names
org.apache.spark.ml.dsl.AppendSuite_PruneDownPath ‑ pincer topology can be defined by A :-> B <-: A
org.apache.spark.ml.dsl.AppendSuite_PruneDownPathKeepRoot ‑ :-> Source is cast to union
org.apache.spark.ml.dsl.AppendSuite_PruneDownPathKeepRoot ‑ :-> Stage is cast to rebase
org.apache.spark.ml.dsl.AppendSuite_PruneDownPathKeepRoot ‑ <-: Source is cast to union
org.apache.spark.ml.dsl.AppendSuite_PruneDownPathKeepRoot ‑ <-: Stage is cast to rebase
org.apache.spark.ml.dsl.AppendSuite_PruneDownPathKeepRoot ‑ A :-> B :-> Source is associative
org.apache.spark.ml.dsl.AppendSuite_PruneDownPathKeepRoot ‑ A :-> B :-> detached Stage is associative
org.apache.spark.ml.dsl.AppendSuite_PruneDownPathKeepRoot ‑ A <-: B <-: Source is associative
org.apache.spark.ml.dsl.AppendSuite_PruneDownPathKeepRoot ‑ A <-: B <-: detached Stage is associative
org.apache.spark.ml.dsl.AppendSuite_PruneDownPathKeepRoot ‑ can automatically generate names
org.apache.spark.ml.dsl.AppendSuite_PruneDownPathKeepRoot ‑ pincer topology can be defined by A :-> B <-: A
org.apache.spark.ml.dsl.CompactionSuite ‑ DoNotCompact should work on Case1  ...
org.apache.spark.ml.dsl.CompactionSuite ‑ DoNotCompact should work on Case2  ...
org.apache.spark.ml.dsl.CompactionSuite ‑ PruneDownPath should work on Case1  ...
org.apache.spark.ml.dsl.CompactionSuite ‑ PruneDownPath should work on Case2  ...
org.apache.spark.ml.dsl.CompactionSuite ‑ PruneDownPathKeepRoot should work on Case1  ...
org.apache.spark.ml.dsl.CompactionSuite ‑ PruneDownPathKeepRoot should work on Case2  ...
org.apache.spark.ml.dsl.ComposeSuite ‑ A compose_> (PASSTHROUGH || Stage) rebase_> B is associative
org.apache.spark.ml.dsl.ComposeSuite ‑ Compose throws an exception when operand2 is type inconsistent with output of operand1 as a Flow
org.apache.spark.ml.dsl.ComposeSuite ‑ Compose throws an exception when operand2 is type inconsistent with output of operand1 as a Source
org.apache.spark.ml.dsl.ComposeSuite ‑ Compose works when operand2 is type consistent
org.apache.spark.ml.dsl.ComposeSuite ‑ PASSTHROUGH compose_> Stage doesn't change the flow
org.apache.spark.ml.dsl.ComposeSuite ‑ Union throws an exception when a stage in result has incompatible number of inputCols
org.apache.spark.ml.dsl.ComposeSuite ‑ Union throws an exception when a stage in result is type inconsistent
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_< Source doesn't work
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_< can append a stage to 2 heads
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_< can append a stage to 2 heads from 1 tail
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_< can append a stage to merged heads
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_< can bypass Source of downstream
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_> (PASSTHROUGH || Stage) generates 2 heads
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_> PASSTHROUGH doesn't change the flow
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_> Source doesn't work
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_> can append a stage to 2 heads
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_> can append a stage to 2 heads from 1 tail
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_> can append a stage to merged heads
org.apache.spark.ml.dsl.ComposeSuite ‑ compose_> can bypass Source of downstream
org.apache.spark.ml.dsl.ComposeSuite ‑ declare API is equally effective
org.apache.spark.ml.dsl.ComposeSuite ‑ result of compose_< can be the first operand of compose_>
org.apache.spark.ml.dsl.ComposeSuite ‑ result of compose_> can be the first operand of compose_<
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ A compose_> (PASSTHROUGH || Stage) rebase_> B is associative
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ Compose throws an exception when operand2 is type inconsistent with output of operand1 as a Flow
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ Compose throws an exception when operand2 is type inconsistent with output of operand1 as a Source
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ Compose works when operand2 is type consistent
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ PASSTHROUGH compose_> Stage doesn't change the flow
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ Union throws an exception when a stage in result has incompatible number of inputCols
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ Union throws an exception when a stage in result is type inconsistent
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_< Source doesn't work
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_< can append a stage to 2 heads
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_< can append a stage to 2 heads from 1 tail
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_< can append a stage to merged heads
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_< can bypass Source of downstream
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_> (PASSTHROUGH || Stage) generates 2 heads
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_> PASSTHROUGH doesn't change the flow
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_> Source doesn't work
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_> can append a stage to 2 heads
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_> can append a stage to 2 heads from 1 tail
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_> can append a stage to merged heads
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ compose_> can bypass Source of downstream
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ declare API is equally effective
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ result of compose_< can be the first operand of compose_>
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPath ‑ result of compose_> can be the first operand of compose_<
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ A compose_> (PASSTHROUGH || Stage) rebase_> B is associative
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ Compose throws an exception when operand2 is type inconsistent with output of operand1 as a Flow
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ Compose throws an exception when operand2 is type inconsistent with output of operand1 as a Source
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ Compose works when operand2 is type consistent
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ PASSTHROUGH compose_> Stage doesn't change the flow
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ Union throws an exception when a stage in result has incompatible number of inputCols
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ Union throws an exception when a stage in result is type inconsistent
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_< Source doesn't work
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_< can append a stage to 2 heads
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_< can append a stage to 2 heads from 1 tail
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_< can append a stage to merged heads
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_< can bypass Source of downstream
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_> (PASSTHROUGH || Stage) generates 2 heads
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_> PASSTHROUGH doesn't change the flow
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_> Source doesn't work
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_> can append a stage to 2 heads
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_> can append a stage to 2 heads from 1 tail
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_> can append a stage to merged heads
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ compose_> can bypass Source of downstream
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ declare API is equally effective
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ result of compose_< can be the first operand of compose_>
org.apache.spark.ml.dsl.ComposeSuite_PruneDownPathKeepRoot ‑ result of compose_> can be the first operand of compose_<
org.apache.spark.ml.dsl.DFDReadWriteSuite ‑ Flow can be serialized into JSON and back
org.apache.spark.ml.dsl.DFDReadWriteSuite ‑ Flow can be serialized into XML and back
org.apache.spark.ml.dsl.DFDReadWriteSuite ‑ Pipeline can be saved and loaded
org.apache.spark.ml.dsl.DFDReadWriteSuite ‑ PipelineModel can be saved and loaded
org.apache.spark.ml.dsl.DFDSuite ‑ Flow can build Pipeline
org.apache.spark.ml.dsl.DFDSuite ‑ Flow can build PipelineModel
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = FailFast_TypeUnsafe, Flow can still build a full pipeline when some of the sources have inconsistent type
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = Force, Flow can still build a full pipeline when some of the sources are missing
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = Force, Flow can still build a full pipeline when some of the sources have inconsistent type
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = IgnoreIrrelevant, Flow can build a full pipeline given a valid schema evidence
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = IgnoreIrrelevant, Flow can build an incomplete pipeline when some of the sources are missing
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = IgnoreIrrelevant, Flow can build an incomplete pipeline when some of the sources have inconsistent type
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = IgnoreIrrelevant_TypeUnsafe, Flow can still build a full pipeline when some of the sources have inconsistent type
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = FailFast, throw an exception when some of the sources are missing
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = FailFast, throw an exception when some of the sources have inconsistent type
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = IgnoreIrrelevant_ValidateSchema, Flow can build an incomplete pipeline when some of the sources are missing
org.apache.spark.ml.dsl.DFDSuite ‑ If adaptation = IgnoreIrrelevant_ValidateSchema, throw an exception when some of the sources have inconsistent type
org.apache.spark.ml.dsl.DFDSuite ‑ Pipeline can be visualized as ASCII art
org.apache.spark.ml.dsl.DFDSuite ‑ Pipeline can be visualized as ASCII art backwards
org.apache.spark.ml.dsl.MapHeadSuite ‑ mapHead_< Source doesn't work
org.apache.spark.ml.dsl.MapHeadSuite ‑ mapHead_< can append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite ‑ mapHead_< can generate 2 stage replicas and append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite ‑ mapHead_< can generate 2 stage replicas and append to 2 selected
org.apache.spark.ml.dsl.MapHeadSuite ‑ mapHead_< won't remove Source of downstream if it's in tails of both sides
org.apache.spark.ml.dsl.MapHeadSuite ‑ mapHead_> Source doesn't work
org.apache.spark.ml.dsl.MapHeadSuite ‑ mapHead_> can append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite ‑ mapHead_> can generate 2 stage replicas and append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite ‑ mapHead_> can generate 2 stage replicas and append to 2 selected
org.apache.spark.ml.dsl.MapHeadSuite ‑ mapHead_> won't remove Source of downstream if it's in tails of both sides
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPath ‑ mapHead_< Source doesn't work
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPath ‑ mapHead_< can append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPath ‑ mapHead_< can generate 2 stage replicas and append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPath ‑ mapHead_< can generate 2 stage replicas and append to 2 selected
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPath ‑ mapHead_< won't remove Source of downstream if it's in tails of both sides
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPath ‑ mapHead_> Source doesn't work
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPath ‑ mapHead_> can append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPath ‑ mapHead_> can generate 2 stage replicas and append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPath ‑ mapHead_> can generate 2 stage replicas and append to 2 selected
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPath ‑ mapHead_> won't remove Source of downstream if it's in tails of both sides
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPathKeepRoot ‑ mapHead_< Source doesn't work
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPathKeepRoot ‑ mapHead_< can append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPathKeepRoot ‑ mapHead_< can generate 2 stage replicas and append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPathKeepRoot ‑ mapHead_< can generate 2 stage replicas and append to 2 selected
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPathKeepRoot ‑ mapHead_< won't remove Source of downstream if it's in tails of both sides
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPathKeepRoot ‑ mapHead_> Source doesn't work
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPathKeepRoot ‑ mapHead_> can append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPathKeepRoot ‑ mapHead_> can generate 2 stage replicas and append to 2 heads
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPathKeepRoot ‑ mapHead_> can generate 2 stage replicas and append to 2 selected
org.apache.spark.ml.dsl.MapHeadSuite_PruneDownPathKeepRoot ‑ mapHead_> won't remove Source of downstream if it's in tails of both sides
org.apache.spark.ml.dsl.SchemaAdaptationSuite ‑ cartesianProduct should work on empty list
org.apache.spark.ml.dsl.SchemaAdaptationSuite ‑ cartesianProduct should work on list
org.apache.spark.ml.dsl.SchemaAdaptationSuite ‑ cartesianProduct should work on list of empty sets
org.apache.spark.ml.dsl.TrieNodeSuite ‑ compact can merge single child parents
org.apache.spark.ml.dsl.TrieNodeSuite ‑ pruneUp can rename single children
org.apache.spark.ml.dsl.TrieNodeSuite ‑ reversed compact can minimize repeated names
org.apache.spark.ml.dsl.TrieNodeSuite ‑ reversed compact can minimize some names
org.apache.spark.ml.dsl.TrieNodeSuite ‑ reversed pruneUp can minimize names
org.apache.spark.ml.dsl.UDFTransformerSuite ‑ transformer can add new column
org.apache.spark.ml.dsl.UDFTransformerSuite ‑ transformer has consistent schema
org.apache.spark.ml.dsl.utils.DSLUtilsSuite ‑ methodName should return caller's name
org.apache.spark.ml.dsl.utils.NullSafetySuite ‑ CannotBeNull can only be converted from Some
org.apache.spark.ml.dsl.utils.NullSafetySuite ‑ String ? Var supports mutation
org.apache.spark.ml.dsl.utils.NullSafetySuite ‑ can be converted from option
org.apache.spark.ml.dsl.utils.NullSafetySuite ‑ can be converted from value
org.apache.spark.ml.dsl.utils.RecursiveEitherAsUnionToJSONSpike ‑ JSON <=> Union of arity 3
org.apache.spark.ml.dsl.utils.RecursiveEitherAsUnionToJSONSpike ‑ JSON <=> case class with Option[Union] of arity 3
org.apache.spark.ml.dsl.utils.RecursiveEitherAsUnionToJSONSpike ‑ JSON <=> case class with Union of arity 3
org.apache.spark.ml.dsl.utils.ScalaNameMixinSuite ‑ can process anonymous function dependent object
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ double to int
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ empty string to Object
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ empty string to Option[Map]
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ empty string to default constructor value
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ int array to int array
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ int to String
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ int to int array
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ int to int seq
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ int to int set
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ missing member to default constructor value
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ sanity test
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ string to int
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ string to int array
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ string to int seq
org.apache.spark.ml.dsl.utils.XMLWeakDeserializerSuite ‑ string to int set
org.apache.spark.ml.dsl.utils.data.EAVSuite ‑ nested <=> JSON
org.apache.spark.ml.dsl.utils.data.EAVSuite ‑ tryGetEnum can convert String to Enumeration
org.apache.spark.ml.dsl.utils.data.EAVSuite ‑ wellformed <=> JSON
org.apache.spark.ml.dsl.utils.data.EAVSuite ‑ withNull <=> JSON
org.apache.spark.ml.dsl.utils.data.Json4sSpike ‑ encode/decode ListMap
org.apache.spark.ml.dsl.utils.data.Json4sSpike ‑ encode/decode Path
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ TypeTag from Type can be serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ can create ClassTag for Array[T]
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ can get TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ can get another TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ can reflect anon class
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ can reflect lambda
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSpike ‑ scala reflection can be used to get type of Array[String].headOption
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From Array[scala.Tuple2] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From Array[scala.Tuple2] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[(String, Int)] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[(String, Int)] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[(String, Int)] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[(String, Int)]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[(String, Int)]] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[(String, Int)]] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[Double]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[Double]] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[Double]] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[Int]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[Int]] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[Int]] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[String]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[String]] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Array[String]] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Double] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Double] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Double] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Int] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Int] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Int] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Seq[(String, Int)]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Seq[(String, Int)]] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Seq[(String, Int)]] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Seq[String]] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Seq[String]] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[Seq[String]] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[String] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[String] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[String] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.Multipart.type] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.Multipart.type] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.Multipart.type] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.Multipart] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.Multipart] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.Multipart] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.User] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.User] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[com.tribbloids.spookystuff.relay.TestBeans.User] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[java.sql.Timestamp] ... even if created from raw Type
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[java.sql.Timestamp] has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From TypeTag[java.sql.Timestamp] is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From scala.Tuple2 has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From scala.Tuple2 is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From scala.collection.immutable.Seq has a mirror
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ From scala.collection.immutable.Seq is Serializable
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [(String, Int) <: TypeTag] ==> Class ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [(String, Int) <: TypeTag] ==> ClassTag ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [(String, Int) <: TypeTag] ==> DataType ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [ArrayType(DoubleType,false) <: DataType] <==> TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [ArrayType(IntegerType,false) <: DataType] <==> TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [ArrayType(StringType,true) <: DataType] <==> TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[(String, Int)] <: TypeTag] ==> Class ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[(String, Int)] <: TypeTag] ==> ClassTag ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[(String, Int)] <: TypeTag] ==> DataType ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[Double] <: TypeTag] <==> DataType
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[Double] <: TypeTag] ==> Class ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[Double] <: TypeTag] ==> ClassTag ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[Int] <: TypeTag] <==> DataType
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[Int] <: TypeTag] ==> Class ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[Int] <: TypeTag] ==> ClassTag ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[String] <: TypeTag] <==> DataType
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[String] <: TypeTag] ==> Class ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Array[String] <: TypeTag] ==> ClassTag ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Double <: TypeTag] <==> Class
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Double <: TypeTag] <==> ClassTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Double <: TypeTag] <==> DataType
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [DoubleType <: DataType] <==> TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Int <: TypeTag] <==> Class
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Int <: TypeTag] <==> ClassTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Int <: TypeTag] <==> DataType
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [IntegerType <: DataType] <==> TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Seq[(String, Int)] <: TypeTag] ==> Class ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Seq[(String, Int)] <: TypeTag] ==> ClassTag ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Seq[(String, Int)] <: TypeTag] ==> DataType ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Seq[String] <: TypeTag] ==> Class ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Seq[String] <: TypeTag] ==> ClassTag ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [Seq[String] <: TypeTag] ==> DataType ==> ?
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [String <: TypeTag] <==> Class
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [String <: TypeTag] <==> ClassTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [String <: TypeTag] <==> DataType
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [StringType <: DataType] <==> TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [TimestampType <: DataType] <==> TypeTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [com.tribbloids.spookystuff.relay.TestBeans.Multipart <: TypeTag] <==> Class
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [com.tribbloids.spookystuff.relay.TestBeans.Multipart <: TypeTag] <==> ClassTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [com.tribbloids.spookystuff.relay.TestBeans.Multipart.type <: TypeTag] <==> Class
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [com.tribbloids.spookystuff.relay.TestBeans.Multipart.type <: TypeTag] <==> ClassTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [com.tribbloids.spookystuff.relay.TestBeans.User <: TypeTag] <==> Class
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [com.tribbloids.spookystuff.relay.TestBeans.User <: TypeTag] <==> ClassTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [java.sql.Timestamp <: TypeTag] <==> Class
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [java.sql.Timestamp <: TypeTag] <==> ClassTag
org.apache.spark.ml.dsl.utils.refl.TypeMagnetSuite ‑ [java.sql.Timestamp <: TypeTag] <==> DataType
org.apache.spark.ml.dsl.utils.refl.TypeSpike ‑ List type equality
org.apache.spark.ml.dsl.utils.refl.TypeSpike ‑ Map type equality
org.apache.spark.ml.dsl.utils.refl.UnReifiedObjectTypeSuite ‑ toString
org.apache.spark.rdd.spookystuff.FallbackIteratorSuite ‑ can consume from 1 iterator
org.apache.spark.rdd.spookystuff.FallbackIteratorSuite ‑ can consume from 2 iterators
org.apache.spark.rdd.spookystuff.ScalaTestJUnitRunnerSpike$$anon$1 ‑ test 1
org.apache.spark.rdd.spookystuff.ScalaTestJUnitRunnerSpike$$anon$2 ‑ test 1
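
Note: every entry in the raw output above follows the pattern <fully-qualified suite class> ‑ <test name string>. As a minimal, purely illustrative sketch (the suite, test name, and assertion below are hypothetical and not taken from the spookystuff codebase), such an entry would originate from a ScalaTest suite like this:

```scala
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical suite, shown only to illustrate how the report entries above are named:
// "<fully-qualified suite class> ‑ <string passed to test(...)>".
class ExampleResolverSuite extends AnyFunSuite {

  // would appear in the report as
  // "ExampleResolverSuite ‑ can convert absolute path of non-existing file"
  test("can convert absolute path of non-existing file") {
    // trivial placeholder assertion; a real suite would exercise the resolver under test
    assert("/a/b/../c".replace("/b/..", "") == "/a/c")
  }
}
```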