
Semantic bug #49

Open

@schneems

Description

Not technically a bug, but perhaps unexpected behavior. Since the update itself is the only atomic operation, side effects performed inside the update block are not protected. For example, appending an element to an array happens in parallel across threads, and values get overwritten:

require 'atomic'
array = Atomic.new([])

threads = []
def insert_into_array(array, value, threads)
  max = 100_000
  threads << Thread.new do
    max.times do
      array.update { |v| v << value }
    end
  end
end


insert_into_array(array, 1, threads)
insert_into_array(array, 2, threads)
insert_into_array(array, 3, threads)

threads.map(&:join)
puts array.value.size

When I run this in JRuby I get a different value every time, such as 294602, instead of the expected 300000 (3 threads x 100,000 appends).

Again, this is technically the correct result, but given the semantics of update and the block syntax I would be led to believe that update also acted like Mutex#synchronize, so the code above would always return the same value.
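
For comparison, here is roughly the behavior I had in mind, sketched with a plain Mutex (just an illustrative sketch, not code from the gem):

require 'thread'

array = []
lock  = Mutex.new

threads = 3.times.map do |i|
  Thread.new do
    100_000.times do
      # The entire append is serialized, so no updates are lost.
      lock.synchronize { array << i + 1 }
    end
  end
end

threads.each(&:join)
puts array.size   # always 300000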

Do you think there's anything we can do to make this experience better, or this result less surprising?
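
For what it's worth, one pattern that seems to give the expected count is to return a new array from the block instead of mutating it in place, so a failed compare-and-swap retries against the latest value (a sketch, assuming update retries on CAS failure; copying the array on every append is obviously far too slow for real use):

require 'atomic'

array = Atomic.new([])

threads = 3.times.map do |i|
  Thread.new do
    100_000.times do
      # Returning a fresh array means every append changes the reference,
      # so a lost update fails the CAS and is retried rather than dropped.
      array.update { |v| v + [i + 1] }
    end
  end
end

threads.each(&:join)
puts array.value.size   # 300000, at the cost of an O(n) copy per append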
