-
I like to slowly adopt new ideas, and I have a pet project that might be a great example for documenting "real world" use cases of Airstream for frontend libraries beyond Laminar. I'm using the infamous "papaparse" js library for parsing CSV. It has a normal parse mode that reads the entire csv into memory, which can handily return a Scala Future. Then there's a fancy Stream option that lets you step through the rows of the csv one by one. For this, I wasn't sure if a Scala Future was the right fit.

Also, the Airstream docs are great, but I am definitely hopeful someone can recommend a short and sweet approach that doesn't demand too much understanding (yet). In time I will get the concepts down, but for now I'd appreciate quick and dirty solutions to see Airstream in action and how it can be applied to common scenarios like this.

Here's an example papaparse facade and a helper function for calling it to parse data as a Future:
@JSImport("papaparse", JSImport.Namespace)
object PapaParse extends js.Object {
def parse(url: File, c: PapaConfig): Unit = js.native
}
object Papa {
def stream(file: File, header: Boolean, dynamicTyping: Boolean, transform: (String, String) => String): Future[js.Array[js.Object]] = {
val p = Promise[js.Array[js.Object]]()
PapaParse.parse(
file,
PapaConfig(
header = true,
dynamicTyping = false,
transform = transform,
step = (results: js.Object, parser: js.Object) => {
// fancy stream data
println(s"result is: ${JSON.stringify(results)}")
// complete a promise for each row of results?...
},
error = (err: js.Object) => {
println(s"error: ${JSON.stringify(err)}")
p.failure(new Exception(JSON.stringify(err)))
},
complete = (result: js.Object) => {
println(s"result is: ${JSON.stringify(result)}")
// regular parse result we are returning a future for
p.success(js.Array(result))
}
)
)
p.future
} |
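The PapaConfig facade referenced above isn't shown. For completeness, here is a minimal, hypothetical sketch of one way it could be written: a non-native JS trait plus an apply helper that builds a plain JS object of the shape papaparse expects. The field names simply mirror the config options used in the helper above.

import scala.scalajs.js

trait PapaConfig extends js.Object

object PapaConfig {
  // Builds a plain JS config object; Scala lambdas passed by the caller are
  // implicitly converted to js.FunctionN values.
  def apply(
      header: Boolean,
      dynamicTyping: Boolean,
      transform: js.Function2[String, String, String],
      step: js.Function2[js.Object, js.Object, Unit],
      error: js.Function1[js.Object, Unit],
      complete: js.Function1[js.Object, Unit]
  ): PapaConfig =
    js.Dynamic.literal(
      header = header,
      dynamicTyping = dynamicTyping,
      transform = transform,
      step = step,
      error = error,
      complete = complete
    ).asInstanceOf[PapaConfig]
}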
-
Update: I did some more reading and realized that with ReactJS, Airstream would require me to manually render the component. I didn't realize that at first, and it may be mixing two different architectures too much for my taste. I may need to revisit Airstream if I get a chance to use Laminar.
-
Your "data pipeline" will have two terminations: the source and the sink. At the source, you want to get events into Airstream. That's easy, just create a custom stream: val (rowStream, nextRow) = EventStream.withCallback[CsvRow] and push events to it inside PapaParse's step = (results: js.Object, parser: js.Object) => {
nextRow(makeCsvRow(results))
} Now that you have a val resultSignal: Signal[Result] = rowStream.foldLeft(initial = Result.empty){ (acc, nextRow) => acc.updateWithRow(nextRow) } Finally, for the sink, you want to put your transformed data into React I guess? The Laminar video has a chapter on integrating Laminar components into React, and the approach here is similar, you need to:
That's it, implement Now, if you don't want any rendering, if you just want to do something with |
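To make that a bit more concrete, here is a rough end-to-end sketch. It is only an illustration under a few assumptions: CsvRow, Result, makeCsvRow and updateWithRow are hypothetical placeholder types and helpers, PapaParse and PapaConfig are the facades from the question, and import paths can differ slightly between Airstream versions.

import scala.scalajs.js
import com.raquo.airstream.core.{EventStream, Observer, Signal}
import com.raquo.airstream.ownership.ManualOwner

object CsvPipeline {

  // Placeholder domain types for this sketch.
  final case class CsvRow(cells: Map[String, String])
  final case class Result(rows: Vector[CsvRow]) {
    def updateWithRow(row: CsvRow): Result = copy(rows = rows :+ row)
  }
  object Result { val empty: Result = Result(Vector.empty) }

  // Placeholder: convert papaparse's per-row `results` object into a CsvRow.
  def makeCsvRow(results: js.Object): CsvRow = CsvRow(Map.empty)

  def run(file: org.scalajs.dom.File): Signal[Result] = {
    // Source: a stream plus a callback for pushing values into it.
    val (rowStream, nextRow) = EventStream.withCallback[CsvRow]

    PapaParse.parse(
      file,
      PapaConfig(
        header = true,
        dynamicTyping = false,
        transform = (value: String, _: String) => value,
        step = (results: js.Object, _: js.Object) => nextRow(makeCsvRow(results)),
        error = (_: js.Object) => (),
        complete = (_: js.Object) => ()
      )
    )

    // Transform: accumulate rows into a Result (foldLeft is called scanLeft in newer Airstream versions).
    rowStream.foldLeft(initial = Result.empty)((acc, row) => acc.updateWithRow(row))
  }

  // Sink: outside Laminar you have to provide an Owner yourself, e.g. a ManualOwner.
  def observe(resultSignal: Signal[Result]): Unit = {
    val owner = new ManualOwner
    resultSignal.addObserver(Observer[Result](r => println(s"rows so far: ${r.rows.size}")))(owner)
    // Call owner.killSubscriptions() when you're done to release the subscription.
  }
}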
-
I haven't really used web workers in JS, so take what I say with a grain of salt.

Web workers are not like threads on the JVM: you don't get shared memory, just the ability to pass messages between workers and the main "thread".

If you want Airstream events to pass from the main thread into a worker, you'll need to transport them via this postMessage mechanism. That means encoding your event values into JSON / strings and creating a separate stream / event bus inside the worker that listens to incoming messages and decodes them.

And if you want to run an instance of Airstream itself inside a web worker, you won't have access to the DOM, so you can't use Laminar in there, which means you can't use owners provided by Laminar inside a web worker; you'll need to manage them yourself manually.

Otherwise I think stuff should still work, it will just be clunky to use due to the above.
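For reference, a minimal sketch of that hand-off might look like the following. It assumes events are plain strings (real payloads would need JSON encoding/decoding), that "worker.js" is the compiled worker script, and that the exact package of DedicatedWorkerGlobalScope can differ between scala-js-dom versions.

import org.scalajs.dom
import com.raquo.airstream.core.{EventStream, Observer}
import com.raquo.airstream.eventbus.EventBus
import com.raquo.airstream.ownership.ManualOwner

// Main thread: forward an existing Airstream stream into a worker via postMessage.
object MainThreadSide {
  def forward(events: EventStream[String]): Unit = {
    val worker = new dom.Worker("worker.js") // path to the compiled worker script
    val owner  = new ManualOwner             // no Laminar here, so provide an Owner manually
    events.foreach(msg => worker.postMessage(msg))(owner)
  }
}

// Worker side: turn incoming messages back into an Airstream stream via an EventBus.
object WorkerSide {
  val bus: EventBus[String] = new EventBus[String]
  val owner = new ManualOwner

  def main(args: Array[String]): Unit = {
    // The worker global scope type lives in different packages across scala-js-dom versions.
    dom.DedicatedWorkerGlobalScope.self.onmessage = (e: dom.MessageEvent) =>
      bus.writer.onNext(e.data.toString)

    bus.events.addObserver(Observer[String](msg => println(s"worker got: $msg")))(owner)
  }
}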
Your "data pipeline" will have two terminations: the source and the sink.
At the source, you want to get events into Airstream. That's easy, just create a custom stream:
and push events to it inside PapaParse's
step
function:Now that you have a
rowStream
, you can map it or apply any other observable transformations. Suppose you want to accumulate some state:Finally, for the sink, you want to put your transformed data int…