FSharp.Control.AsyncSeq & the Async Streaming Ecosystem – February 2026
Repository Pulse
The repo saw a flurry of agentic CI automation merged this week: an AI co-maintainer workflow (running every 4 hours) and a daily performance-improver agent. The latest stable release is v3.3.1, which delivered:
A new mapAsyncUnorderedParallel function for improved parallel throughput
Significant performance wins on iterAsync, iteriAsync, collect, mapAsync, unfoldAsync, and append (closing issue #35, "Memory leak when unfolding")
BenchmarkDotNet infrastructure to track performance systematically
The recently merged PR #204 renamed bufferByCount → chunkBySize, aligning with .NET naming conventions (Seq.chunkBySize).
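Under the new name, usage mirrors its Seq counterpart. A small sketch, assuming the rename in PR #204 kept bufferByCount's array-chunking behaviour:

```fsharp
open FSharp.Control

// Chunk a five-element sequence into arrays of at most two elements;
// like Seq.chunkBySize, the final chunk may be short.
let chunks =
    asyncSeq { for i in 1 .. 5 do yield i }
    |> AsyncSeq.chunkBySize 2
    |> AsyncSeq.toListAsync
    |> Async.RunSynchronously
// expected shape: [[|1; 2|]; [|3; 4|]; [|5|]]
```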
Open PR to watch: #214 – Expose ofAsyncEnum/toAsyncEnum on netstandard2.0. The root cause was a mismatch between the .fsi signature file (gating behind NETSTANDARD2_1) and the .fs implementation (which already supports NETSTANDARD || NET). A one-line .fsi fix could resolve the long-standing issue #173.
Hot Issues & Community Feedback
The IAsyncEnumerable naming collision (issue #159) is particularly subtle: because this library defines its own FSharp.Control.IAsyncEnumerable<'T> with different semantics (using Async<'T option> for MoveNext rather than ValueTask<bool>), users who open both FSharp.Control.AsyncSeq and FSharp.Control.TaskSeq can get confusing type errors. The long-term answer is likely a major-version deprecation.
The tryFirst / Async.Choice idea (#205) is genuinely interesting: when elements are produced by parallel async computations, returning the first available rather than the first in order better matches the parallel mental model. The analogy to Async.Choice is apt.
Ecosystem & Competitive Landscape
FSharp.Control.TaskSeq (v0.4.0, March 2024)
The main sibling/competitor in the F# async-sequence space. TaskSeq implements IAsyncEnumerable<'T> from the .NET BCL using resumable state machines (the same engine as task { }), with ValueTask for performance. The choosing guide summarises the trade-offs:
|                      | AsyncSeq              | TaskSeq                         |
| -------------------- | --------------------- | ------------------------------- |
| Async model          | F# Async<'T>          | .NET ValueTask<'T>              |
| BCL IAsyncEnumerable | Custom (incompatible) | ✅ Native                       |
| .NET target          | netstandard2.0+       | netstandard2.1+                 |
| Fable support        | ✅ Yes                | ❌ No                           |
| await in CE          | let! with Async<'T>   | let! with Task<'T> or Async<'T> |
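The last row can be illustrated with a minimal asyncSeq expression, where let! binds plain F# Async values:

```fsharp
open FSharp.Control

// In asyncSeq { }, let! binds an Async<'T> value; yield emits elements lazily.
let twoValues : AsyncSeq<int> = asyncSeq {
    let! x = async { return 1 }
    yield x
    yield x + 1
}
```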
TaskSeq v0.4.0 added 25+ new functions (forall, skip, take, insertAt, min/max, etc.) and moved from module TaskSeq to type TaskSeq (a static class) to allow overloads, a breaking but cleaner design.
Opportunity for AsyncSeq: many of the functions TaskSeq added (e.g. skip, take, insertAt, updateAt) are still absent from AsyncSeq. A function-parity project could significantly improve the library's ergonomics.
Rx.NET & AsyncRx.NET
The dotnet/reactive repository now maintains several libraries, including:
System.Reactive (Rx.NET): Push-based IObservable<'T>, mature and widely used.
AsyncRx.NET (experimental): IAsyncObservable<'T>, think Rx.NET but with async/await-friendly subscriptions. Still in preview.
System.Linq.Async: Standard LINQ operators for IAsyncEnumerable<'T>.
The free "Introduction to Rx.NET 2nd Edition" eBook (introtorx.com/redacted) was updated for Rx.NET v6.0 and .NET 8, a valuable free resource for anyone thinking about reactive patterns.
Conceptual note: AsyncSeq sits in an interesting middle ground between IEnumerable<'T> (pull, sync) and IObservable<'T> (push, async). The library already has ofObservableBuffered and toObservable, which is a nice bridge.
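A minimal sketch of that bridge using the toObservable function named above, subscribing with a plain BCL IObserver (no Rx extension methods required):

```fsharp
open System
open FSharp.Control

// Convert a cold AsyncSeq into an IObservable with AsyncSeq.toObservable.
let numbers : AsyncSeq<int> = asyncSeq { yield 1; yield 2; yield 3 }
let observable : IObservable<int> = AsyncSeq.toObservable numbers

// A hand-rolled observer; each subscription re-runs the cold sequence.
let observer =
    { new IObserver<int> with
        member _.OnNext x = printfn "received %d" x
        member _.OnError e = printfn "error: %s" e.Message
        member _.OnCompleted() = printfn "completed" }

let subscription : IDisposable = observable.Subscribe observer
```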
.NET & F# Platform News
.NET 11 Preview 1 was released in February 2026. .NET 10 (GA, 2025) is the current LTS.
F# on .NET 10 (Preview 7+):
and! support in TaskBuilder enables parallel awaiting of independent tasks inside task { }. This is directly relevant to async computation patterns.
"Make unused bindings an error" β improves code quality enforcement.
Improvements to computation expression error reporting (better ranges for let!, use!, match!, return!).
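The and! feature in the first bullet can be sketched as follows (assuming the new TaskBuilder support described above; the bound tasks here are trivial placeholders):

```fsharp
open System.Threading.Tasks

// and! groups the two bindings so they are started together and
// awaited as a pair, rather than sequentially.
let sumOfTwo () : Task<int> = task {
    let! a = Task.FromResult 1
    and! b = Task.FromResult 2
    return a + b
}
```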
Fable 4.29.0 (stable) / 5.0-alpha: Fable 5 is expanding Python support aggressively: Python 3.10+ match statements, async def for task { } expressions, and ABC base classes. AsyncSeq already has Fable support; monitoring Fable 5 compatibility will be important as the Python target matures.
New Ideas & Opportunities
1. AI/ML Pipeline Integration
With the explosion of LLM streaming APIs (SSE, chunked HTTP), AsyncSeq is a natural fit for consuming token streams from models like GPT or Claude. An AsyncSeq.ofHttpResponseStream or integration example with System.Net.Http.HttpClient response streams could attract a new user base. The library's pull-based model maps well to reading SSE event streams or paginated API results.
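A hedged sketch of what such an integration could look like. ofHttpLines is a hypothetical helper, not a library function; it streams response lines lazily as the consumer pulls them:

```fsharp
open System.IO
open System.Net.Http
open FSharp.Control

// Hypothetical helper (NOT in the library): read an HTTP response body
// line by line as an AsyncSeq, keeping the connection open while pulling.
let ofHttpLines (client: HttpClient) (url: string) : AsyncSeq<string> = asyncSeq {
    // ResponseHeadersRead starts streaming before the body is fully buffered.
    use! resp =
        client.GetAsync(url, HttpCompletionOption.ResponseHeadersRead)
        |> Async.AwaitTask
    use! stream = resp.Content.ReadAsStreamAsync() |> Async.AwaitTask
    use reader = new StreamReader(stream)
    // ref cell rather than let mutable: mutable locals cannot be
    // captured by the closures the computation expression generates.
    let finished = ref false
    while not finished.Value do
        let! line = reader.ReadLineAsync() |> Async.AwaitTask
        if isNull line then finished.Value <- true
        else yield line
}
```

For SSE specifically, a consumer would then filter for lines starting with "data:" and parse the payload downstream.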
2. Back-Pressure and Rate Limiting
Issue #174 questions whether iterAsyncParallelThrottled belongs in this library. A more principled answer might be a small set of back-pressure combinators (throttle, debounce, buffer with overflow strategies) similar to what Akka Streams or Reactor provide. These patterns are increasingly relevant for event-driven microservices.
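A minimal sketch of the idea, built only from existing combinators. rateLimit is a hypothetical name, not a library function, and a real combinator family would need the overflow strategies mentioned above:

```fsharp
open FSharp.Control

// Crude rate limiter: delay each element before passing it on,
// putting an upper bound on throughput of the downstream consumer.
let rateLimit (delayMs: int) (source: AsyncSeq<'T>) : AsyncSeq<'T> =
    source
    |> AsyncSeq.mapAsync (fun x -> async {
        do! Async.Sleep delayMs
        return x })
```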
3. chunkBy: Grouping Adjacent Elements
Issue #156 requests a chunkBy : ('T -> 'Key) -> AsyncSeq<'T> -> AsyncSeq<'Key * AsyncSeq<'T>> style function for run-length grouping. This is distinct from chunkBySize: it groups adjacent elements that share a key (useful for log parsing and event-stream partitioning). It's low-hanging fruit with clear use cases.
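The requested semantics can be illustrated on a plain list. chunkByList is a hypothetical helper showing the contract only; a real AsyncSeq implementation would stream groups rather than build them eagerly:

```fsharp
// Adjacent elements mapping to the same key end up in one group;
// the same key may reappear later as a separate group (run-length style).
let chunkByList (key: 'T -> 'K) (xs: 'T list) : ('K * 'T list) list =
    xs
    |> List.fold (fun acc x ->
        let k = key x
        match acc with
        // extend the current run if the key matches the latest group
        | (k', group) :: rest when k' = k -> (k', x :: group) :: rest
        // otherwise start a new run
        | _ -> (k, [ x ]) :: acc) []
    |> List.map (fun (k, g) -> k, List.rev g)
    |> List.rev

// chunkByList (fun (s: string) -> s.Length) ["a"; "b"; "cc"; "dd"; "e"]
// groups as [(1, ["a"; "b"]); (2, ["cc"; "dd"]); (1, ["e"])]
```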
4. Cancellation Improvements
Issue #122 notes that iterAsyncParallel and iterAsyncParallelThrottled may fail to cancel. With AI agents running long parallel pipelines, robust cancellation is more important than ever. A systematic audit of cancellation token propagation across all parallel combinators would be valuable.
5. Bridging AsyncSeq ↔ IAsyncEnumerable
As the BCL IAsyncEnumerable<'T> becomes more prevalent (used by EF Core, Azure SDK, ML.NET, etc.), smooth interop is crucial. The ofAsyncEnum/toAsyncEnum PR (#214) is a step forward. A longer-term goal could be making AsyncSeq<'T> implement the BCL IAsyncEnumerable<'T> directly (in a netstandard2.1+ target), while keeping the custom interface for netstandard2.0 and Fable.
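A hedged sketch of one direction of that bridge. It assumes the custom enumerator shape described earlier (MoveNext returning Async<'T option>, plus IDisposable); toBclAsyncEnumerable is a hypothetical name, and the BCL types are fully qualified precisely to sidestep the issue #159 naming collision:

```fsharp
open System.Threading
open System.Threading.Tasks
open FSharp.Control

// Wrap an AsyncSeq<'T> (the library's custom IAsyncEnumerable) as the
// BCL System.Collections.Generic.IAsyncEnumerable<'T>.
let toBclAsyncEnumerable (source: AsyncSeq<'T>) : System.Collections.Generic.IAsyncEnumerable<'T> =
    { new System.Collections.Generic.IAsyncEnumerable<'T> with
        member _.GetAsyncEnumerator(ct: CancellationToken) =
            let inner = source.GetEnumerator()
            // ref cell: mutable locals cannot be captured by object expressions
            let current = ref Unchecked.defaultof<'T>
            { new System.Collections.Generic.IAsyncEnumerator<'T> with
                member _.Current = current.Value
                member _.MoveNextAsync() =
                    // Translate Async<'T option> into the BCL's ValueTask<bool> step.
                    let step = async {
                        match! inner.MoveNext() with
                        | Some v -> current.Value <- v; return true
                        | None -> return false }
                    ValueTask<bool>(Async.StartAsTask(step, cancellationToken = ct))
              interface System.IAsyncDisposable with
                member _.DisposeAsync() =
                    inner.Dispose()
                    ValueTask() } }
```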
Enjoyable Anecdote
The library's asyncSeq { } computation expression was one of F#'s early demonstrations that computation expressions could elegantly model lazy async pull sequences, years before .NET's IAsyncEnumerable existed. When .NET finally standardised async enumeration in 2019, F# developers had already been doing it ergonomically for years. The community's response was essentially: "Yes, we know, we've had this since 2012."
This head-start has a small ironic downside: the library's custom IAsyncEnumerable type predates and differs from the BCL type of the same name, leading to the naming collision tracked in issue #159. A classic case of being too early.
Related Research
"Asynchronous Sequences and Coroutines in F#" β Tomas Petricek's original design (2011-2013), which laid the conceptual groundwork for the library.
Reactive Streams specification (reactive-streams.org): a vendor-neutral standard for async stream back-pressure; comparing AsyncSeq semantics to this spec could reveal gaps and alignment opportunities.
"Project Loom" (Java): Virtual threads + structured concurrency in Java 21+ are solving similar problems in the JVM world. The StructuredTaskScope API (fork/join with cancellation) is conceptually close to what iterAsyncParallel does.
Kotlin Flow: Coroutine-based cold streams (similar mental model to AsyncSeq). Kotlin Flow's operators like flatMapMerge, buffer, and conflate could inspire future AsyncSeq additions.
AI-generated content by Agentic Weekly Researcher (Ruby), may contain mistakes.