Releases: datastax/spark-cassandra-connector

Release 1.0.4

07 Nov 10:18

1.0.4

  • Synchronized TypeConverter.forType methods to work around some Scala 2.10
    reflection thread-safety problems (#235)
  • Synchronized computation of TypeTags in the TypeConverter#targetTypeTag and
    ColumnType#scalaTypeTag methods and other places to work around some of the
    Scala 2.10 reflection thread-safety problems (#364); see the sketch below
  • Downgraded Guava to version 14.
    Upgraded Java driver to 2.0.7.
    Upgraded Cassandra to 2.0.11. (#366)
  • Made SparkContext variable transient in SparkContextFunctions (#373)
  • Fixed saving to tables with uppercase column names (#377)
  • Fixed saving collections of Tuple1 (#420)
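
A minimal Scala sketch of the synchronization workaround mentioned above; the names (ReflectionLock, HypotheticalTypeConverter, computeTargetTypeTag) are illustrative and not the connector's actual code. The point is that every Scala 2.10 runtime-reflection call is funneled through a single shared lock.

    import scala.reflect.runtime.universe._

    // Shared lock guarding all runtime-reflection calls (illustrative name).
    object ReflectionLock

    trait HypotheticalTypeConverter[T] {
      // Computing a TypeTag touches the shared reflection universe,
      // which is not thread-safe in Scala 2.10.
      protected def computeTargetTypeTag: TypeTag[T]

      // Callers obtain the tag under the lock, so concurrent conversions
      // cannot corrupt shared reflection state.
      def targetTypeTag: TypeTag[T] = ReflectionLock.synchronized {
        computeTargetTypeTag
      }
    }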

Release 1.1.0 beta 1

27 Oct 19:10

Changes since 1.1.0-alpha4:

1.1.0 beta 1

  • Redesigned the Java API, with some refactorings (#300)
  • Simplified AuthConf, moving more responsibility onto CassandraConnectionFactory;
    see the sketch below
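
The sketch below is only an illustration of the direction of the AuthConf change: the connection factory itself produces a fully configured Cluster, credentials included. The trait shown is a hypothetical stand-in, not the connector's actual CassandraConnectionFactory definition; only the Java driver calls (Cluster.builder, addContactPoints, withCredentials) are real API.

    import com.datastax.driver.core.Cluster

    // Hypothetical stand-in for the connector's connection factory trait.
    trait HypotheticalConnectionFactory extends Serializable {
      def createCluster(hosts: Seq[String]): Cluster
    }

    // A factory that takes over authentication instead of relying on a separate AuthConf.
    class PasswordAuthFactory(user: String, password: String)
      extends HypotheticalConnectionFactory {

      override def createCluster(hosts: Seq[String]): Cluster =
        Cluster.builder()
          .addContactPoints(hosts: _*)
          .withCredentials(user, password) // credentials handled by the factory itself
          .build()
    }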

Preview release 1.1.0 alpha 4

16 Oct 16:08

Changes:

  • Used asynchronous prefetching of multi-page ResultSets in CassandraRDD
    to reduce waiting for Cassandra query results.
  • Made the token range start and end parameters of the query rather than part of
    the query template, to reduce the number of statements requiring preparation;
    see the sketch below.
  • Added type converter for GregorianCalendar (#334)
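
A minimal sketch of the token-range change above, shown with the raw DataStax Java driver rather than the connector's internal code; keyspace, table, and column names are placeholders. Because the range bounds are bind parameters, scanning many token ranges reuses a single prepared statement.

    import scala.collection.JavaConverters._
    import com.datastax.driver.core.Cluster

    object TokenRangeQuerySketch {
      def main(args: Array[String]): Unit = {
        val cluster = Cluster.builder().addContactPoint("127.0.0.1").build() // assumed contact point
        val session = cluster.connect()
        try {
          // Prepared once; the start and end of each token range are plain bind parameters.
          val stmt = session.prepare(
            "SELECT * FROM ks.tab WHERE token(pk) > ? AND token(pk) <= ?")

          val ranges = Seq((Long.MinValue, 0L), (0L, Long.MaxValue))
          for ((start, end) <- ranges) {
            val rows = session.execute(stmt.bind(start: java.lang.Long, end: java.lang.Long))
            rows.asScala.foreach(println)
          }
        } finally {
          session.close()
          cluster.close()
        }
      }
    }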

Release 1.0.3

15 Oct 12:24

1.0.3

  • Fixed handling of Cassandra rpc_address set to 0.0.0.0 (#332)

Preview release 1.1.0 alpha 3

10 Oct 16:51

Changes:

  • Added a pluggable mechanism for obtaining connections to Cassandra and the
    ability to pass a custom CassandraConnector to CassandraRDDs (#192); see the
    sketch below
  • Provided a row reader which allows creating RDDs of pairs of objects as well
    as RDDs of simple objects handled directly by a type converter;
    added meaningful compiler messages when an invalid type is provided (#88)
  • Fixed serialization problem in CassandraSQLContext by making conf transient (#310)
  • Cleaned up the SBT assembly task and added build documentation (#315)
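
A rough sketch of the pluggable-connection change above, assuming the implicit-parameter pattern used by cassandraTable; host, keyspace, and table names are placeholders. A custom CassandraConnector built from its own configuration is placed in implicit scope, so the RDD uses it instead of the connector derived from the SparkContext configuration.

    import com.datastax.spark.connector._
    import com.datastax.spark.connector.cql.CassandraConnector
    import org.apache.spark.{SparkConf, SparkContext}

    object CustomConnectorSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("custom-connector-sketch"))

        // A connector pointing at a different Cassandra cluster than the one
        // configured on the SparkContext (the host is a placeholder).
        implicit val connector = CassandraConnector(
          new SparkConf().set("spark.cassandra.connection.host", "10.0.0.1"))

        // cassandraTable picks up the implicit connector from scope.
        val rdd = sc.cassandraTable("ks", "tab")
        println(rdd.count())

        sc.stop()
      }
    }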

Release 1.0.2

10 Oct 16:31

Changes:

  • Fixed batched updates of counter columns (#234, #316)
  • Exposed both the rpc addresses and local addresses of Cassandra nodes in
    partition preferred locations (#325)
  • Cleaned up the SBT assembly task and added build documentation
    (backport of #315)

Release 1.0.1

08 Oct 14:12

Changes:

  • Added logging of the error message when an asynchronous task fails in
    AsyncExecutor (#265)
  • Fixed connection problems when fetching token ranges from hosts whose
    rpc_address differs from their listen_address.
    Host addresses and ports are now logged on connection failures, and the Thrift
    transport is closed if the connection fails after the transport has been opened,
    e.g. due to an authentication failure.
  • Upgraded the Cassandra Java driver to 2.0.6.

Preview release 1.1.0 alpha 2

02 Oct 11:52

Changes:

  • Upgraded Apache Spark to 1.1.0.
  • Upgraded to be compatible with both Cassandra 2.1.0 and Cassandra 2.0.
  • Added the spark.cassandra.connection.local_dc option
  • Added the spark.cassandra.connection.timeout_ms option
  • Added the spark.cassandra.read.timeout_ms option (see the sketch below)
  • Added support for SparkSQL (#197)
  • Fixed problems with saving DStreams to Cassandra directly (#280)
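
A minimal sketch of where the options listed above plug in; the option names come from this release, while the application name, host, and values are placeholders.

    import org.apache.spark.{SparkConf, SparkContext}

    object ConnectorOptionsSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("connector-options-sketch")
          .set("spark.cassandra.connection.host", "127.0.0.1")   // assumed contact point
          .set("spark.cassandra.connection.local_dc", "DC1")     // prefer nodes in this data center
          .set("spark.cassandra.connection.timeout_ms", "5000")  // connection timeout, in ms
          .set("spark.cassandra.read.timeout_ms", "120000")      // read timeout, in ms
        val sc = new SparkContext(conf)
        // ... build Cassandra-backed RDDs or DStreams as usual ...
        sc.stop()
      }
    }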

Preview release 1.1.0 alpha 1

23 Sep 07:58

Warning: This is a preview release - the API might change.

Version 1.1.0 is intended to be compatible with Spark 1.0.2 and Spark 1.1.x, as well as Cassandra 2.0 and Cassandra 2.1.

Changes from 1.0.0:

  • Added an ./sbt/sbt script (as in Spark) so users don't need to install sbt
  • Replaced the internal Spark Logging with our own class (#245)
  • Accepted partition key predicates in CassandraRDD#where (#37); see the sketch below
  • Added indexedColumn to ColumnDef (#122)
  • Upgraded Spark to version 1.0.2.
  • Removed deprecated toArray, replaced with collect.
  • Updated imports to org.apache.spark.streaming.receiver
    and org.apache.spark.streaming.receiver.ActorHelper
  • Updated streaming demo and spec for Spark 1.0.2 behavior compatibility
  • Added new StreamingEvent types for Spark 1.0.2 Receiver readiness
  • Added the following Spark Streaming dependencies to the demos module:
    Kafka, Twitter, ZeroMQ
  • Added embedded Kafka and ZooKeeper servers for the Kafka Streaming demo
    • kept non-private for user prototyping
  • Added new Kafka Spark Streaming demo which reads from Kafka
    and writes to Cassandra (Twitter and ZeroMQ are next)
  • Added new 'embedded' module
    • Refactored the 'connector' module's IT SparkRepl, CassandraServer and
      CassandraServerRunner, as well as the 'demos' module's EmbeddedKafka and
      EmbeddedZookeeper, into the 'embedded' module. This allows the 'embedded'
      module to be used as a dependency by the 'connector' IT tests, demos, and
      local user prototyping, without requiring a Spark or Cassandra cluster,
      local or remote, to get started.
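
A small sketch of the CassandraRDD#where change (#37) referenced above: a predicate on a partition key column is pushed down to Cassandra, so only the matching partition is read instead of filtering a full table scan on the Spark side. Keyspace, table, and column names are placeholders.

    import com.datastax.spark.connector._
    import org.apache.spark.{SparkConf, SparkContext}

    object WhereOnPartitionKeySketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("where-sketch"))

        // The predicate on the partition key column is appended to the
        // generated CQL query and evaluated by Cassandra.
        val rdd = sc.cassandraTable("ks", "events")
          .where("user_id = ?", "some-user-id")

        println(rdd.count())
        sc.stop()
      }
    }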

Release 1.0.0

18 Sep 08:36

First stable release.