Sharing or accessing large immutable data across Akka actors


I'm working on an application that does a lot of number crunching on a single immutable data structure (a collection of large arrays, essentially a big matrix; a typical size is 200 columns by 100,000 rows of doubles). A lot of the computation can be parallelized in different ways, and I'd like to leverage the Akka actor model to tackle the problem.
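For concreteness, here is a minimal sketch of the kind of structure described above; the names (`ImmutableMatrix`, `columnSlice`) and the column-major layout are assumptions for illustration, not from the question.

```scala
// Sketch of an immutable column-major matrix: a Vector of Double arrays.
// The arrays are never mutated after construction, so they can be shared freely.
final case class ImmutableMatrix(columns: Vector[Array[Double]]) {
  require(columns.nonEmpty)
  val numRows: Int = columns.head.length // e.g. 100,000
  val numCols: Int = columns.length      // e.g. 200

  // Hand a worker a subset of columns: this copies only the Vector of
  // references, never the underlying Double arrays themselves.
  def columnSlice(from: Int, until: Int): Vector[Array[Double]] =
    columns.slice(from, until)
}
```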

I'm worried about having to pass the matrix (or parts of it) around in messages, because I think that would involve a whole lot of copying and serialization.

Am I over-thinking the problem? That is, if a supervisor actor passes some of the arrays of the matrix to its subordinates, is the Akka runtime smart enough to pass a reference (assuming everything is kept on the same JVM), as opposed to serializing the message for passing?

I guess a simpler way of asking the question is: should one avoid situations where massive data structures are passed around in messages?

Just to reiterate: the data in question is totally immutable. It never changes.

Akka only serializes messages if you send them to a remote actor (or, by extension, to another cluster node). Remote doesn't necessarily mean that proper networking is involved; it can be a different JVM on the same machine (over the loopback interface). If you have one actor system on one JVM without remoting, only the reference is passed, in the same manner as if you put the message on a queue and had a different thread read it.
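A minimal sketch of that point, assuming classic (untyped) Akka actors in a single actor system with no remoting; the `Crunch` message and `Worker` actor are hypothetical names. The message carries only a reference to the shared arrays, so nothing is copied or serialized when the supervisor and the worker live in the same JVM.

```scala
import akka.actor.{Actor, ActorSystem, Props}

// Message holding a reference to (a slice of) the immutable data.
// Within one JVM, only this reference is enqueued; the doubles are not copied.
final case class Crunch(columns: Vector[Array[Double]], from: Int, until: Int)

class Worker extends Actor {
  def receive: Receive = {
    case Crunch(cols, from, until) =>
      // Reads the shared arrays directly; safe because they are never mutated.
      val sum = (from until until).map(i => cols(i).sum).sum
      println(s"columns [$from, $until): sum = $sum")
  }
}

object Main extends App {
  val system = ActorSystem("crunch")
  val data   = Vector.fill(200)(Array.fill(100000)(1.0))
  val worker = system.actorOf(Props[Worker](), "worker")
  // Same JVM, no remoting configured: the message is passed by reference.
  worker ! Crunch(data, 0, 50)
}
```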
