Lluis Franco & Alex Casquete
.NET Conference 2015
Evolution of the async model
Async void is only for top-level event handlers.
Use the threadpool for CPU-bound code, but n...
It seems you’re calling an async
method without awaiting…
Can I help you with something?
Yep! Return a Task object plz
...
private async void Button1_Click(object Sender, EventArgs e) {
Thread oThread = new Thread(new ThreadStart(myExpensiveMeth...
private async void Button1_Click(object Sender, EventArgs e) {
//Thread oThread = new Thread(new ThreadStart(myExpensiveMe...
private int myExpensiveMethod()
{
...
return 42;
}
private void Button1_Click(object sender, EventArgs e)
{
var function =...
var numbers = Enumerable.Range(1, 10000000);
var query = numbers.AsParallel().Where(n => n.IsPrime());
var primes = query....
var customers = Customer.GetSampleCustomers();
Parallel.ForEach(customers, c => {
if(!c.IsActive) c.Balance = 0;
});
public static List<string> GetNetworkSQLServerInstances() {
//Get local network servers
return servers;
}
private void upd...
public static Task<List<string>> GetNetworkSQLServerInstancesAsync() {
//Get local network servers on a background thread
return Task.Run(() => servers);
}
//ASYNC/AWAIT version
private async void Button1_Click(object sender, EventArgs e) {
var servers = await GetNetwor...
public static void SimpleBody() {
Console.WriteLine("Hello, Async World!");
}
.method public hidebysig static void SimpleB...
public static async Task SimpleBody() {
Console.WriteLine("Hello, Async World!");
}
.method public hidebysig static class ...
private async void Button1_Click(object Sender, EventArgs e) {
try {
SendData("https://secure.flickr.com/services/oauth/re...
// Q. It sometimes shows PixelWidth and PixelHeight are both 0 ???
BitmapImage m_bmp;
protected override async void OnNavi...
// A. Use a task
Task<BitmapImage> m_bmpTask;
protected override async void OnNavigatedTo(NavigationEventArgs e) {
base.On...
' In VB, the expression itself determines void- or Task-returning (not the context).
Dim void_returning = Async Sub()
Awai...
// table1.DataSource = LoadHousesSequentially(1,5);
// table1.DataBind();
public List<House> LoadHousesSequentially(int fi...
// table1.DataSource = LoadHousesInParallel(1,5);
// table1.DataBind();
public List<House> LoadHousesInParallel(int first,...
[Timing diagrams: the sequential version runs start1→end1, start2→end2, … one house after another; the Parallel.For version overlaps the work on threadpool threads and gets the response out in ~200ms; the ideal async version issues start1…start5 together, the ends trickle back out of order, and the response is out in ~100ms.]
// table1.DataSource = await LoadHousesAsync(1,5);
// table1.DataBind();
public async Task<List<House>> LoadHousesAsync(in...
public async void btnPayout_Click(object sender, RoutedEventArgs e)
{
double initialPrice, strikePrice, drift, volatility ...
Foo();                   // synchronous: performs the operation,
                         // returns only when it’s done
var task = FooAsync();   // asynchronous: initiates the operation,
...                      // returns immediately
await task;
public static void PausePrint2() {
Task t = PausePrintAsync();
t.Wait();
}
// “I’m not allowed an async signature,
// but ...
The threadpool is an app-global resource
In a server app, spinning up threads hurts scalability
The app is in the best pos...
async Task LoadAsync() {
await IO.Network.DownloadAsync(path);
}
void Button1_Click(){
var t = LoadAsync();
t.Wait();
Upda...
server scalability.
We all know sync methods are “cheap”
public static void SimpleBody() {
Console.WriteLine("Hello, Async World!");
}
.method...
Not so for asynchronous methods
public static async Task SimpleBody() {
Console.WriteLine("Hello, Async World!");
}
.metho...
public static async Task<int> GetNextIntAsync()
{
if (m_Count == m_Buf.Length)
{
m_Buf = await FetchNextBufferAsync();
m_C...
var x = await GetNextIntAsync();
// ...compiles into:
var $awaiter = GetNextIntAsync().GetAwaiter();
if (!$awaiter.IsCompleted) {
DO THE AWAIT/...
public static async Task<int> GetNextIntAsync()
{
if (m_Count == m_Buf.Length)
{
m_Buf = await FetchNextBufferAsync();
m_C...
The heap is an app-global resource.
Like all heap allocations, async allocations can contribute to hurting GC perf.
Sync context represents a “target for work”
“Await task” uses the sync context
“where you were before”
But for library cod...
UI responsiveness
Lluis Franco & Alex Casquete
Async best practices
If you liked it, don’t forget
to fill out the survey!
Thanks
Async Best Practices Speech @ DotNet Spain Conference 2015

Publicado en: Tecnología
0 comentarios
6 recomendaciones
Estadísticas
Notas
  • Sé el primero en comentar

Sin descargas
Visualizaciones
Visualizaciones totales
3.414
En SlideShare
0
De insertados
0
Número de insertados
1.391
Acciones
Compartido
0
Descargas
57
Comentarios
0
Recomendaciones
6
Insertados 0
No insertados

No hay notas en la diapositiva.
  • * And so on to the first of four sections of this talk.
    * Async void is only for event-handlers.
    * I'll motivate it with developer stories.
    * Actually, all the scenarios in this talk come straight from developers, and most of the code.
    [CLICK]
    * "I have a Silverlight page that uses RIA services async to load the data for the page."
    * "This works fine if the user waits for a few seconds before selecting the print button."
    * "But does *not* work if the user prints right away."
    * "If the user clicks the Print button before all of the page data is loaded, the printed output does not have all of the data."
    * Diagnosis: she was using async void deep inside her code.
    * Fix: should return Task from her internal async methods, not void.
  • * Let me put that more strongly
    * For goodness' sake, stop using async void everywhere.
    * (At first that was going to be the title of my talk)
  • * Actually, I won't show her code because it was a bit involved.
    * I'll show someone else who made the exact same mistake.
    * Their async method returns void rather than Task.
    * This code actually comes from Microsoft's own official Win8 SDK samples!
    * Goes to show that it's a common mistake that anyone can make.
    [CLICK]
    * Clicks a button, invokes the handler
    [CLICK]
    * Invokes SendData, which kicks off request for data and then awaits response
    * You know what happens now. At the first await, control returns straight back to the caller.
    [CLICK]
    * Normally at this point the caller would await until SendData finishes.
    * But SendData returned void, not Task, so the caller can’t do that.
    * Instead it awaits a Task Delay, so returns back to its own caller, the UI message loop
    [CLICK]
    * Some time later, the response will come back. Or the delay will finish.
    * Don't know which will happen first.
    * Maybe it'll assign to m_GetResponse first. Or maybe not.
    * That's what the developer said "My code doesn't work 100% reliably".
    * Had obviously experimented with Task.Delay until they got the right delay to work on their dev network!
    * The problem is all down to this async void SendData.
    * It's a void, right. It doesn't return anything to its caller.
    * The caller can't do anything with it.
    * The caller is UNABLE to know when SendData has finished.
    * It's basically fire-and-forget.
    * That's the crux of the problem, fire-and-forget.
  • * Actually, before we go on to fix it, I want to highlight another problem with fire-and-forget async voids.
    * Let's comment out the problematic race condition
    * and see how exceptions behave from a fire-and-forget method.
    * We'll focus on the try/catch, to catch exceptions arising from SendData.
    [CLICK]
    * Once again, we invoke the handler
    [CLICK]
    * It calls SendData.
    [CLICK]
    * As you know, at the first await, it returns to its caller.
    * The thing is, at this stage, there's been no exception yet, so nothing gets caught.
    * We breeze through the catch block and return to the UI.
    [CLICK]
    * Now the network request comes back, maybe with a 404 error.
    * And SendData throws an exception.
    * But where can an exception go out of a fire-and-forget method?
    * Can't go back to Button1_Click, because that's already finished.
    * Answer is that all exceptions from these fire-and-forget async voids
    get posted straight to the UI thread.
    * In Win8, terminates app. In Phone, silently swallowed. In WPF, dialog.
    * In no cases is that desirable.
  • * We've seen that async void is a "fire-and-forget" mechanism
    * Meaning: the caller is *unable* to know when an async void has finished.
    * And the caller is *unable* to catch exceptions from an async void method.
    * Guidance is to use async void solely for top-level event handlers.
    * Everywhere else in code, like SendData, async methods should return Task.
    * There’s one other danger, about async void and lambdas, I’ll come to it in a moment.
  • * But first let’s fix SendData.
    * SendData should return Task, not void.
    * Convention: every method that returns Task has a name ending with Async
    * The caller sees that name and knows he should await it.
    * And we can get rid of that awful Task.Delay rubbish.
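The fix described above can be sketched like this (a sketch, not the sample's exact code: the URL, the use of HttpClient, and the m_GetResponse field are assumptions based on the slide fragments):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class Page
{
    static readonly HttpClient s_client = new HttpClient();
    string m_GetResponse;

    // Task-returning, and named with the Async suffix, so the caller
    // can await completion and catch any exception it throws.
    async Task SendDataAsync(string url)
    {
        m_GetResponse = await s_client.GetStringAsync(url);
    }

    // The top-level event handler is still async void, but now it awaits:
    async void Button1_Click(object sender, EventArgs e)
    {
        try
        {
            await SendDataAsync("https://secure.flickr.com/services/oauth/");
            // m_GetResponse is guaranteed to be set here; no Task.Delay hack.
        }
        catch (HttpRequestException ex)
        {
            Console.WriteLine(ex.Message);   // exceptions now flow to the caller
        }
    }
}
```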
  • * Well, we've said async void is for fire-and-forget
    * And the only place that's appropriate is for event-handlers, or event-like things.
    * What do I mean by "event-like things"? Sometimes it's hard to know.
    * Let's look at this case.
    * I was wondering how bad the problem is of people misusing async void.
    * I looked through the MSDN forums for "async void" and "problem".
    * A lot of hits came back from this function "async void LoadState"
    * You might not know about it.
    * In Win8 apps. When you get to a page, it fires the NavigatedTo event
    * The base class handles the event with an overridable void-returning method OnNavigatedTo.
    * So that method's basically like an event-handler, fire-and-forget. It's fine to be async.
    [CLICK]
    * First thing it does is call its base method.
    * If the page had already been shown before, it just returns.
    * But just for the first time that a page is shown, it invokes the virtual void method LoadState.
    * So LoadState is also basically like an event-handler, fire-and-forget. It's fine to be async void.
    * OnNavigatedTo is called every time you navigate to a page. LoadState is called only once per time the page has been constructed.
    [CLICK]
    * Maybe you can see where this is going...
    * Let's trace it out.
    * We get to a page. Invoke the OnNavigatedTo virtual method, fire-and-forget.
    * Which calls its base.
    [CLICK]
    * Which kicks off LoadState, fire-and-forget, which we’ve overridden.
    * It does an await
    [CLICK]
    * Which goes back to its caller, who does an await, and returns to the UI message-loop
    [CLICK]
    * But now, there are two fire-and-forget async voids in flight. Which one will go first?
    * Will it be the bottom one who loads the bitmap?
    * Or the top one you uses the bitmap, and assumes it's already loaded?
    * The forums question you hear is "Why is PixelWidth 0?"
    * It's because they're querying a bitmap that hasn't loaded yet.
  • * Well, the answer here is to use a task.
    * It would have been easier to change LoadState to be a Task-returning async method.
    * But we can’t do that. We don’t control the signature. We’re just overriding it.
    * So instead we’ll have to pass the Task back an alternative way.
    * Here, LoadState kicks off an async method that loads a bitmap
    * But not fire-and-forget.
    * Oh no. Instead, it'll remember that task, and save it in the m_bmpTask field.
    * That way, OnNavigatedTo can await for the same task to finish.
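A minimal sketch of that pattern, using the Windows 8 XAML types from the slide (BitmapImage, NavigationEventArgs); LoadBitmapAsync is a hypothetical loader, not an API from the sample:

```csharp
// Instead of a BitmapImage field set by a fire-and-forget async void,
// store the Task so later code can await the same load.
Task<BitmapImage> m_bmpTask;

protected override void LoadState(object navParam,
                                  Dictionary<string, object> pageState)
{
    // Kick off the load, but remember the task rather than forgetting it.
    m_bmpTask = LoadBitmapAsync();   // hypothetical async loader
}

protected override async void OnNavigatedTo(NavigationEventArgs e)
{
    base.OnNavigatedTo(e);
    var bmp = await m_bmpTask;       // completes once, awaitable many times
    // bmp.PixelWidth is now reliable: the bitmap has finished loading.
}
```

A Task that has already completed can be awaited any number of times, so every navigation after the first gets the cached result immediately.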
  • * There's just one other surprise place where async voids will bite you
    * In C#, when you write an async lambda, it can be either void-returning or Task-returning.
    * The syntax of the lambda doesn't tell you which.
    * Instead, it's the context that tells you.
    * Here I’ve assigned the same async lambda to both a void-returning Action delegate and a Task-returning Func&lt;Task&gt; delegate. No compiler errors. Both work fine.
    * Look at this call to Task.Run. It passes an async lambda.
    * Will that be void-returning or Task-returning?
    [CLICK]
    * Well, if both overloads are offered, it'll pick Task-returning. Good!
    [CLICK]
    * In VB, the situation's different
    * Here it's not the context that decides if it's void-returning Sub or Task-returning Function.
    * Instead the expression itself says which it is.
    * But the conclusion's the same. The method you call should generally offer both overloads.
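The C# side of this can be sketched in a few lines (a sketch; the delegate names are mine):

```csharp
using System;
using System.Threading.Tasks;

class LambdaDemo
{
    static async Task Main()
    {
        // The same async lambda converts to either delegate type;
        // in C#, the target context decides which one you get.
        Action fireAndForget = async () => await Task.Delay(10);
        Func<Task> awaitable = async () => await Task.Delay(10);

        // Task.Run offers a Func<Task> overload, so overload resolution
        // prefers it: this lambda is Task-returning, and the await
        // really does wait for the lambda's body to finish.
        await Task.Run(async () => await Task.Delay(10));
        Console.WriteLine("done");
    }
}
```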
  • * Let's see async lambda problems in practice
    * Here I'm writing a Win8 app which invokes Dispatcher.RunAsync
    * I'm passing it an async lambda.
    * Whenever I see an async lambda being passed to a function, I always check that function.
    * Look at the bottom of the slide.
    * In this case, it takes something called a DispatchedHandler,
    * which is void-returning.
    * So it's passing a void-returning async.
    [CLICK]
    * We can imagine what will happen.
    * The dispatcher will kick off the lambda, fire-and-forget.
    * The lambda will get to the first await.
    [CLICK]
    * It'll return to its caller. And the dispatcher will think it's done.
    [CLICK]
    * So our await on the dispatcher will finish, and we'll plow through the rest of our method.
    [CLICK]
    * Meanwhile, m_Result doesn't get set until too late,
    * and the exception from our async lambda was never caught.
    * That's because our async lambda was void-returning, fire-and-forget.
    [CLICK]
    * You understand the problem.
    * We’ll touch on a solution in next section. It’s subtle.
  • * But first, let's sum up.
    * "For goodness' sake, stop using async void"
    * That's because async void means fire-and-forget.
    * And fire-and-forget is only appropriate for event-handlers.
  • * Now section two of four.
    * I want to talk about the threadpool, about IO- and CPU-bound workloads.
    * Let's hear what the developer had to say. He said:
    [CLICK]
    * "I'm now looking at the biggest user complaint about a slow running operation in an ASP.NET WebForms page."
    * "Essentially, the page loads some data and I'm wondering if it'd be the best approach to use the Task Parallel Library."
    * "The method itself deserializes an object and depending on user choices can call the method in a foreach 26+ times, the result of which I bind to a gridpanel."
    * "The deserialization itself is where 99% of the time is being spent."
    * Well, that's brilliant! He used profiling first. He identified the problem area.
    * The punchline is that his code turned out not to be CPU-bound, and so he should have been using await.
  • * But let’s look at what the developer started with.
    * It's a zillow-like housing app. He's deserializing a load of houses,
    and databinding them to his webform.
    * And we’ll start by taking him at his word that the deserialization work is CPU-bound.
    [CLICK]
    * If we draw a flow chart of it, the request comes in, then it does
    one house after another, and then finishes.
    * If each house takes 100ms to deserialize, and he does five houses, then
    it'll be 500ms before the user sees anything in his web-browser.
  • * This is what the developer tried using the TaskParallelLibrary.
    * He used Parallel.For, to deserialize all the houses in parallel.
    * This lambda is the work for each house that has to be done.
    [CLICK]
    * Let's draw a flow-chart for it.
    * First a request comes in.
    * Then he does Parallel.For, which means that five lambdas will have to be executed eventually.
    * Then the threadpool does those five pieces of work.
    * The threadpool will run them on as many threads as will be fastest.
    * My laptop has two CPU cores -- a boy and a girl, you can see, so two cores will be fastest.
    * If each request takes 100ms, then we'll get the answer out in 300ms.
    That's an improvement!
    * Actually, it might not be. If we're running on a server that has other workload as well,
    then one of the cores will probably be taken, so we'll only have one.
  • * Oh. Just hold on there a moment.
    * What the heck kind of deserialization takes so long? 100ms per house? That's an eternity.
    * Well, I checked with the developer.
    * Turns out his deserialization wasn't really what I'd call deserialization.
    * It was looking up tables in a database.
    * That's why it took so long. It was network-bound, not CPU-bound.
  • * So this is what his first sequential code was actually doing.
    * It was downloading data for each house, one after the other.
    * But it only took a miniscule amount of time to kick off each request,
    then it was idle for about 100ms,
    then it got back the response from the network.
  • * But let's look back at how his Parallel.For code was behaving.
    * Well, as we said, it had five workitems in the threadpool.
    * Let's say the threadpool started with two threads, because of my two cores.
    [CLICK]
    * Gradually it'll realize that its threads aren't really being used,
    and it'll add an extra thread to do some more work.
    [CLICK]
    * Maybe an extra thread as well.
    * The threadpool will gradually find the optimum number of threads to run
    a given workload, but it's fairly slow to respond.
    [CLICK]
    * In this case maybe it only ended up growing by two extra threads.
    * Well, this result came in about 200ms.
    * In general, threadpool growth isn't the right way to get responsive code.
    * That's because it does take time to get there.
    * Sometimes you'll see it adding just one new thread a second.
  • * Let's draw a flow diagram about how this code should ideally work.
    * We should kick off all five requests in one go.
    * We might as well issue the requests in sequence, since it's so quick to issue a request.
    * Later on, about 100ms later, the responses trickle in.
    * They might come out of order. That doesn't matter. We'll get them all.
    * And we should have them all done within about 100ms.
    * That's the fastest "Time To First Byte" of all our solutions.
  • * Back to the developer's scenario.
    * This is the code that the developer should have used
    * He can kick off tasks for all the database loads.
    * And then await Task.WhenAll, until they're all finished.
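The "kick off everything, then Task.WhenAll" shape can be sketched like this (self-contained; LoadHouseAsync and the Task.Delay stand in for the developer's database load, which we don't have):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

class House { public int Id; }

class Loader
{
    // Hypothetical stand-in for the per-house database load (~100ms of I/O).
    static async Task<House> LoadHouseAsync(int id)
    {
        await Task.Delay(100);
        return new House { Id = id };
    }

    public static async Task<List<House>> LoadHousesAsync(int first, int last)
    {
        // Kick off all the requests at once (ToList starts every task)...
        var tasks = Enumerable.Range(first, last - first + 1)
                              .Select(LoadHouseAsync)
                              .ToList();
        // ...then await them all: total time ≈ one request, not five.
        return (await Task.WhenAll(tasks)).ToList();
    }
}
```

Task.WhenAll returns the results in the same order as the input tasks, even though the responses come back out of order.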
  • * What would we do if there weren’t 5 houses but 500?
    * Can’t make 500 requests all at once. We need to throttle the rate of requests.
    * Here’s what I think is the easiest idiom.
    * A queue of work-items.
    * Async method WorkerAsync runs through the queue and fires requests, one after the other.
    * And if I kick off three of these workers, then I’ll have throttled it to 3 at a time!
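That throttling idiom can be sketched as follows (a sketch of the queue-plus-workers idea, with my own names; the work items here are just ints):

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

class Throttle
{
    // One worker: drains the shared queue, one request in flight at a time.
    static async Task WorkerAsync(ConcurrentQueue<int> queue,
                                  Func<int, Task> processAsync)
    {
        while (queue.TryDequeue(out int item))
            await processAsync(item);
    }

    // Kick off 'degree' workers over the queue: at most 'degree' items
    // are ever being processed concurrently.
    public static Task ForEachThrottledAsync(int[] items, int degree,
                                             Func<int, Task> processAsync)
    {
        var queue = new ConcurrentQueue<int>(items);
        var workers = Enumerable.Range(0, degree)
                                .Select(_ => WorkerAsync(queue, processAsync));
        return Task.WhenAll(workers);
    }
}
```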
  • Calling F# (CPU-bound demo)
  • IProgress interface
  • * So let's review.
    * It's vital to distinguish between what is CPU-bound work and what is IO-bound work.
    * CPU-bound means things like LINQ-to-objects, or iterations, or computationally-intensive inner loops.
    * Parallel.ForEach and Task.Run are good ways to put these CPU-bound workloads on the threadpool.
    * But it's important to understand that adding threads doesn't increase scalability
    * Scalability is about not wasting resources
    * One of those resources is threads
    * Let's say your server can handle 1000 threads.
    * If you had 1 thread per request, then you could handle 1000 requests at a time.
    * But if you created 2 threads per request, then only 500 requests at a time.
  • * The first library tip is that library method signatures shouldn't "lie".
    * (that's not why they're called lie-braries).
    * If a method looks async, if it smells async, then it should be async.
    * Let's see what that means in practice.
  • Here I’ve written two different methods that someone might call.
    The first, in yellow, is synchronous.
    The second, in blue, is asynchronous.
    * So what are the assumptions people will make about how to call these two?
    * Imagine someone comes up to your API.
    * They're going to read the documentation.
    * Hah! Who am I kidding? They might read the XML doc-comments if we're lucky.

    For the first one, they’ll see its name, and expect it to be synchronous.
    Everyone knows what that means.
    They think it will perform something right away
    and will only return once it’s finished its work
    It’ll probably be using CPU all the time it’s running.
    [CLICK]
    For the second one, they’ll see its name ends in Async.
    They’ll think they can call the method to initiate something, but they’ll get back control immediately
    Maybe they’re writing a server app. They’ll expect that the method isn’t going to spawn new threads or use up CPU in the background. They can trust it to be a good citizen on their server.
    * They also know that they can parallelize it.
    * Maybe it's a download API. They can kick off 10 downloads simultaneously, just by invoking it 10 times and then awaiting Task.WhenAll.
    And it's not going to hurt their scalability to do so.


    * The thing is, your callers will look at the signature of your method, and they'll make assumptions, right or wrong, about how you're implemented underneath.
    It'll be your job to stay in line with those expectations.
  • This distinction between sync and async is important, because it will affect how you architect your async APIs.
    * Let's spell it out with some concrete examples
    * Just some code to pause 10 seconds, then print "Hello".
    * I know, it's not much of a library, but it's a start!
    [CLICK]
    * This is just an example… I'm not suggesting you do this at home!
    * It’s an example of an API that's synchronous in both senses...
    * Its signature looks synchronous, and its implementation really does block the calling thread. Actually it’s even worse than that, it burns CPU cycles to do so.
    [CLICK]
    * And here's an example of an API that's asynchronous in both senses.
    * It uses TaskCompletionSource to generate a Task
    * It schedules a timer to wake it up in 10 seconds time
    * And when the timer wakes up, it prints to the screen and marks the Task as completed.
    Hardly any CPU used at all. It's all just scheduling.
    (Some people ask, “What about Timer itself? Are you saying that Timer’s own internal implementation is Async as well? Is it just turtles all the way down?
    Well, yes. You know in the Task Manager where it shows System Idle taking up 95% of your CPU? It’s not really burning CPU. It’s probably switched the CPU to a low-power state and is waiting for the next hardware interrupt.)
    [CLICK]
    * Actually, we wouldn't write it that way. We'd write it using an "await".
    * But the two implementations here are basically the same.
    [CLICK]
    * It's what I'd call "true async"
    [CLICK]
    * Here's another piece of code. Let's study this.
    It uses Task.Run to run the synchronous code on the threadpool
    So, it’s blocking up a threadpool thread.
    I see people doing this quite a lot. Maybe they’ve heard that async is good, and they want to offer up an async signature, but they’re calling an underlying library that is synchronous.
    [CLICK]
    * There's something fishy about this, isn't there?
    The signature looks async, it smells async, but the implementation is burning CPU.
    It’s fine for an application to use the threadpool in this way if it wants, but it’s bad for a library to secretly use Task.Run internally. I’ll discuss why.
    [CLICK]
    * I want to show you one last example.
    This routine initiates an async operation.
    But then it blocks its thread, with the blocking call to Wait().
    I see this often, or equivalently using the blocking property Task.Result.
    Often it’s because people are writing code that fits into a larger synchronous framework, but they need to make one small call to an Async method.
    Sometimes people do this because they wanted to offer a synchronous API from their library, as well as an async one.
    [CLICK]
    * But there's something fishy about this too.
    * The signature looks synchronous, it smells synchronous, but the underlying implementation is async.
    * When we're writing library APIs, we should try to stick to the top-left or bottom-right.
    The other two styles, with the arrows, are dangerous. Confusing to users of the API.
    [CLICK]
    I want to dig deeper into the two fishy patterns, the orange and the red.
    Just a tip, if you’re forced into the bottom-left scenario, this link has some workarounds.
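The TaskCompletionSource-plus-timer implementation described in these notes can be sketched like this (a sketch in the spirit of the talk, not its exact code):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class TrueAsync
{
    // "True async": no thread is blocked while the 10 seconds pass.
    public static Task PausePrintAsync()
    {
        var tcs = new TaskCompletionSource<bool>();
        Timer timer = null;
        timer = new Timer(_ =>
        {
            Console.WriteLine("Hello, Async World!");
            timer.Dispose();
            tcs.SetResult(true);          // mark the Task as completed
        }, null, TimeSpan.FromSeconds(10), Timeout.InfiniteTimeSpan);
        return tcs.Task;                  // returned immediately, still pending
    }

    // Equivalent, and how we'd actually write it today:
    public static async Task PausePrintAsync2()
    {
        await Task.Delay(TimeSpan.FromSeconds(10));
        Console.WriteLine("Hello, Async World!");
    }
}
```

Both versions sit in the "async signature, async implementation" quadrant: the only thing that runs on a thread is the millisecond of work when the timer fires.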
  • So what’s so fishy about libraries that use Task.Run internally?

     
    [VS: 1.LibrariesShouldntLie]
    * Simple app, console app, but I'm just spinning up a Winforms dialog here.
    * This is the minimal code I need to get a UI message-loop. (don't want rest of plumbing)
     

    class Library
    * It's a demo of a library, so we'll have three layers: the app that uses the library, then the library itself, then the underlying framework functionality that the library uses.
    * Here's one way to write the library. It wants to offer up an asynchronous API
    * And in this version, it's using the synchronous OS API. Maybe that's the only one available.
    * So to become async, my API needs to wrap it, with await Task.Run
    * It's the top-right quadrant. It looks async, but it's wrapping an implementation that's synchronous.
    * Probably to avoid blocking the calling thread.
     
    b.Click += async delegate
    * And here's what the app developer wrote, the user of my library.
    * They want to be asynchronous, they want to stay responsive.
    * But say they don't want just one, but they want to download 100 files.
    * They saw that it was an async method, so they trusted they could just kick off all the tasks and then await Task.WhenAll
     
    [RUN]
    * Now it has kicked off all 100 of those tasks.
    * But because each one wants to use a background thread, it's actually going in bursts.
    * I have four logical cores on this laptop, so the threadpool starts by giving me four threads. As many threads as we have cores.
    * Then, a second later, it sees that you've made poor use of those threads: most of them were idle, waiting on IO
    * So it looks like you need more threads
     
    [RUN]
    * See the first batch was 4, then next batch was 5, then 6
    * The threadpool has this predefined scaling behavior, hill-climbing
    * So I've had to wait until the threadpool catches up to me, until it eventually finds its optimal number.
    * But actually my app didn't need any threads.
    * As an app author, I didn't even think any threads were involved.
    * That's the key. You don't want to go messing with things that aren't yours, global resources.
    * And the threadpool is one of those things.
    * It belongs to the app developer, not to you the library author.
    * They might have their own ideas about how they want to use the threadpool.
     
    var contents = await IO.DownloadFileAsync()
    * Now this one's pure async
     
    [RUN]
    * And this time all 100 files can download at the same time.
    * This is what we'd expect.
    * I shouldn't have to block waiting for the threadpool to grow
    * I just have the assumption that I'm just kicking off work from the UI thread.
    * You don't want to be a library author who violates that assumption

    [SLIDES]
    We have to think of the threadpool as an app-global resource.
    Remember that hill-climbing that we saw. It’s done across all code across all libraries in the app.
    * In a server app, spinning up a bunch of threads hurts scalability.
    * I don't want to create new threads, because my caller might be relying on those other threads to be request-threads, to handle new incoming requests.
    * And imagine if my library uses Task.Run deep inside - then it'll be a pain for users to diagnose.
    * It wasn't a mistake they made. It was a mistake for them to trust my API.
    * The app is in the best position to manage its threads.
    Let the user use their own domain-knowledge about what they're building to decide how they want to manage threads.
    * If your library's using Task.Run, you're putting in roadblocks that prevent the app from using its threads effectively
    * If the caller wants to go make some synchronous work happen on a background thread, let them do that themselves. It’s fine for your caller to use Task.Run. Just you shouldn’t do it in your library.
    * You should expose something that looks like what it is.
    If you only have an implementation that's synchronous, then expose as an API that's synchronous.
    Only provide async methods when you can implement them asynchronously.
    That will help your callers make the call on how to call you.
  • That showed you the dangers of the top-right quadrant.
    I want to show you the even worse dangers of the bottom-left quadrant:
    blocking code, that uses task.Wait() or similar.

    At the start of this “essential tips on Async” series, I explained how the message-loop works in a UI app with this diagram.
    [CLICK]
    The user clicks a button in the UI, and it invokes the message-handler, which calls a LoadAsync method
    [CLICK]
    That creates a “Task” and returns to its caller, where the task is assigned to variable t
    But now we did a terrible thing. We blocked, waiting until that task had completed.
    [CLICK]
    If you remember how Async works under the hood, once the DownloadAsync task has finished, it moves into the message-queue so that the message-pump can handle it.
    But our message-pump is now blocked, stuck on the task.Wait() call!
    This is a deadlock.
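The deadlock from the diagram, and the usual fix, can be sketched like this (based on the slide's LoadAsync; UpdateView is a hypothetical name for the truncated call):

```csharp
// DANGER: deadlocks on a UI thread.
void Button1_Click(object sender, EventArgs e)
{
    var t = LoadAsync();
    t.Wait();            // blocks the UI thread here...
    UpdateView();        // ...while LoadAsync's continuation, queued
}                        // to this same thread, can never run.

// Fix: stay async all the way up.
async void Button1_ClickFixed(object sender, EventArgs e)
{
    await LoadAsync();   // UI thread keeps pumping messages meanwhile
    UpdateView();
}
```

Library code can also reduce the risk by awaiting with ConfigureAwait(false), so its continuations don't need the caller's sync context; but the robust fix is not to block on tasks in the first place.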
  • * Principle is that the threadpool is a global resource.
    * You as a library developer have to play nice, help the app author use their domain-specific knowledge as to how to create threads using Task.Run if they want.
    * Don't do it yourself.
    * Only show an async signature when your method is truly async.
    * Also, don't block in your library calls.
    * If you block on the UI thread, disaster.
    * If you block on a threadpool thread, you're hurting the threadpool because of that hill-climbing, hurting everyone else in the app who uses the threadpool.
    * The next thing to address for async libraries is some aspects of async perf.
    * Can I use await in an inner loop?
    * If I have a method "ReadByte" in my API, would I also want "ReadByteAsync()" if I'm getting a million bytes one at a time?
    * It's a question of how chatty vs chunky to design my API: like smooth versus chunky peanut butter.
    * Most importantly, we'll understand how to optimize around some common special cases.
  • * Just to step back, we're all used to synchronous methods.
    * We know they're cheap. That's why we're happy to factor out our code at will, take out these lines of code and put them into a separate method
    [CLICK]
    * And if we look at the IL, we can see how simple it is.
    * I know, I know, IL isn't the best way to judge the cost of something. It’s just a starting point for comparison.
  • * For async methods, though, the compiler puts in plumbing, and it's not so simple
    * Here's the same method as the last slide, but it uses the async modifier
    [CLICK]
    * First, the compiler generates this code for the method
    [CLICK]
    * And actually this is only part of it.
    * Part of what the compiler generates is a structure with a MoveNext method, and here’s a call to this MoveNext method
    [CLICK]
    * And let's look at the MoveNext it generates.
    * It's the plumbing that lets an async method pause and resume.
    [CLICK]
    * If I highlight it, you can still see the core bit that corresponds to what I wrote
    * Let's set this in context. The code overhead isn't much: roughly equivalent to running an empty for loop about 200 times.
    * If you're doing it a few hundred times a second, doesn't matter.
    * It's just if you'll be using it in a tight inner loop, then you need to think.
    * And you know what? This IL might look scary, but it's actually more efficient than anything you'd write if you tried to do async callbacks by hand.
    * We've been able to optimize everything around await, use internal methods on Task, use detailed understanding of JIT.
    * It's not that the await keyword is particularly slow.
    * It's just there's a slight inherent overhead to async APIs.
    * Usually that overhead will be negligible.
    * It's just in a tight loop that it adds up.
    [CLICK]
    * Actually, the real mental model I want you to have in your mind is that it's ALLOCATION that's expensive.
    * Technically, allocation is cheap, it's the garbage-collection afterwards that's expensive. Like getting drunk and then getting a hangover.
    * If you want to play nice with memory, you want to avoid allocating memory as much as possible.
    * I think of the heap as another app-global shared resource, and it's our job as library authors not to trample on it unfairly.
  • * For async methods, there are three particular allocations that show up.
    * It allocates a "state machine" class which holds all the method's local variables, and remembers which await it's got up to (so it can resume after it).
    * It allocates a delegate. Delegates in .NET are heap objects. This delegate is the continuation: it's signed up on the awaited Task, to be executed when that Task completes.
    * And your async method returns a Task object. Which is a heap object. So each async method allocates that.
    [CLICK]
    * But there are some really powerful optimizations here.
    * The core point is that async methods start executing immediately.
    * Like in this case, my method gets the next integer, but it downloads the integers in chunks, into buffers
    * So 99.9% of the time it can return immediately. It doesn't need any awaits.
    * It's only when a chunk runs out that it needs the next one.
    * It's only when we get to the first await point that we return to our caller, and incur all those allocation costs.
    So if it happens that you never even get to an await point, never need to return to the caller until the end, it avoids the first two allocations entirely!
    Now there’s another important under-the-hood optimization which makes this optimization much more powerful…
  • We call it the "fast path"
    [CLICK]
    * Here's an await operator, and here's the codegen that the compiler makes for it.
    * When you await something, it first checks "is that thing already completed"
    * You might wonder, what kinds of tasks will be completed already before I even await them?
    * Well I just showed you one on the previous slide!
    * 99% of the time, GetNextIntAsync has already completed, so the await operator can fly over it really fast.
    * And because we flew over it, we didn't even return to the caller.
    * That's important. You might have heard the message "At the first await in an async method, it returns to its caller."
    * But that's not precisely true.
    * Really, "At the first await WHICH HAS NOT YET COMPLETED, it returns to its caller".
    * And this fast path has a great synergy with the previous slide.
    * Because if all our awaits take the fast path, then we're basically skipping over them, and we avoid those two heap allocations for the async method.
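Slide 40 sketches the compiler's conceptual expansion of an await (pseudocode, not the literal codegen):

```csharp
// var x = await GetNextIntAsync();  expands roughly to:
var awaiter = GetNextIntAsync().GetAwaiter();
if (!awaiter.IsCompleted)
{
    // slow path only: suspend, return to the caller, resume here later
}
var x = awaiter.GetResult();   // fast path: no suspension at all
```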
  • You might wonder, “what kind of Task would already be complete at the time I await for it?”
    Well, here’s a great example – 1023 times out of 1024, anyone who invokes GetNextIntAsync, they’ll get back a Task object that has already completed.
    So they’ll benefit from all the built-in fast-path optimizations themselves. It’s a virtuous circle.
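Slide 39's buffered reader, made self-contained here with a stubbed FetchNextBufferAsync (the stub and its 1024-element chunk size are assumptions for illustration):

```csharp
using System.Threading.Tasks;

static class IntReader
{
    static int[] m_Buf = new int[0];   // empty, so the first call refills it
    static int m_Count = 0;

    // Stub standing in for the real chunked data source.
    static Task<int[]> FetchNextBufferAsync() =>
        Task.FromResult(new int[1024]);

    public static async Task<int> GetNextIntAsync()
    {
        if (m_Count == m_Buf.Length)           // buffer exhausted: slow path
        {
            m_Buf = await FetchNextBufferAsync();
            m_Count = 0;
        }
        m_Count += 1;                          // 1023 times out of 1024 we get
        return m_Buf[m_Count - 1];             // here without suspending at all
    }
}
```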

    There’s one final memory allocation in an Async method: that is allocation of the returned Task object.
    [CLICK]
    * But if the method managed to take the fast path on all its awaits,
    * and if the returned value in your Task<T> was one of the "common" ones like 0, 1, true, false, null, then it avoids even allocating the task.
    * Also if your async method just returned non-generic Task.
    * That's because the framework keeps just a singleton copy of about ten common Task objects
    [CLICK]
    * That'll be good if you want to return Task<bool>, or an empty string in Task<string>
    * If you're returning some other value, like arbitrary integers, it doesn't make sense for the framework to have a singleton Task<int> for every single possible integer.
    * Or if you're returning a string, the framework can't have a singleton Task<string> for every possible string.
    * So in those cases you can cache a Task object yourself.
    So the first optimizations we saw around the fast-path and common values, they all happen automatically.
    * But this final optimization, caching the returned Task if it's not one of the common ones, that requires some work on your part.
     
    [VS: 3.CacheTasks]
    * Here I'm going to show you a typical pattern you can use to cache the returned Task, to avoid having to allocate a new Task object every single time.
     
    byte[] data = new byte[0x10000000];
    * I'm going to allocate a quarter of a gig, and measure how many allocations it needs to copy it.
     
    input.CopyToAsync(Stream.Null).Wait();
    * For the copying, I'll be using the .NET framework method Stream.CopyToAsync
     
    int newGen0 = GC.CollectionCount(0);
    * And I'll be measuring how many times the GC had to run
     
    class MemStream1 : MemoryStream
    * What I'm testing is two different implementations of MemoryStream
     
    return Read(buffer, offset, count);
    * My test uses Stream.CopyToAsync, so I know it's going to call into ReadAsync
    * This first implementation is just a simple async method
    * no awaits, so it always takes the fast path
    * But it returns the number of bytes read.
    * This isn't one of the common values, so it's not a singleton.
    * Instead it's going to allocate a new Task object every time this is called.
    * It happens that Stream.CopyToAsync uses buffers of 80 KB each time, so every Task<int> that it allocates will be a Task<int> with value 81920
    * But it's still allocating a new copy of that every single time.
     
    private Task<int> m_cachedTask;
    * Let's look at this second memory-stream implementation.
    * This one keeps a cache of the last Task<int> it returned.
     
    if (m_cachedTask != null && m_cachedTask.Result == numRead)
    * And if the Task it's cached has the right value, well, it might as well return that.
    * A single Task object can be used as many times as you like after it's been completed.
    * After the Task has completed, it's immutable.
     
    m_cachedTask = Task.FromResult(numRead);
    * But if the cache wasn't there, or had the wrong value,
    * then we'll generate an already-completed task with the right value.
    * That's what Task.FromResult does.
     
    [RUN, CTRL+F5]
    * And there we see that we've saved an appreciable number of allocations.
    * I want to stress, it doesn't cache the last INTEGER it returned.
    * That'd miss the point. Our goal is to reduce the number of Task objects we allocate.
    * So we have to cache the Task<int>, not just the int.
    * One thing to ask, how big should our cache be?
    * Here I've just used a single-element cache. It only stores the previous one.
    * And what you'll find is that, generally, just a single-element cache works great!
     
    TrackGcs(new MemoryStream(data));
    * I just wanted to show you some more perf numbers.
    * Here I'm going to use the standard built-in MemoryStream
     
    [RUN IT, CTRL+F5]
    * What we see is that MemoryStream actually has some further internal optimizations to eliminate all GCs in this test.
    * You can go a long way. How worthwhile it is comes down to how much time you want to spend as a library author, and how frequently your library APIs will be called.
     

    * In general, you as a library-author should use domain-knowledge about the nature of your API, to decide whether and how it makes sense to cache tasks.
    * We used it in the .NET framework to dramatically improve the performance of BufferedStream and MemoryStream.
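Pulling the demo fragments together, a sketch of the caching pattern (the MemStream2 name and the ReadAsync override shape are assumed from the demo's description):

```csharp
using System.IO;
using System.Threading;
using System.Threading.Tasks;

class MemStream2 : MemoryStream
{
    // Single-element cache: the last Task<int> we handed out.
    private Task<int> m_cachedTask;

    public MemStream2(byte[] data) : base(data) { }

    public override Task<int> ReadAsync(byte[] buffer, int offset, int count,
                                        CancellationToken cancellationToken)
    {
        int numRead = Read(buffer, offset, count);   // synchronous, no await

        // A completed Task is immutable, so it's safe to hand the same
        // instance out as many times as we like.
        if (m_cachedTask == null || m_cachedTask.Result != numRead)
            m_cachedTask = Task.FromResult(numRead);

        return m_cachedTask;
    }
}
```

With Stream.CopyToAsync's 80 KB buffers, almost every call returns the same count, so the single-element cache hits nearly every time.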
  • * We've talked about perf considerations. They largely relate to the heap and GC, and avoiding unnecessary allocations.

    * Async/await keywords are as fast as they can be, and the inherent overheads are only noticeable in a tight inner loop. We're talking millions of iterations, not just a few hundred or thousand.
    * If you can't help it, and the shape of your API means it has to be called frequently, there are some great built-in perf features that happen automatically.
    * First, there’s the "Fast Path". If an await has already completed, then it just plows right through it.
    And if you get to the end of the method without any "slow-path" awaits, then you avoid a bunch of memory allocations.

    * Guidance is, try to avoid chatty APIs. Make APIs where the consumer of your library doesn't have to await in an inner loop.
    * You can offer GetNextKilobyteAsync() instead of GetNextBitAsync().
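For instance (hypothetical signatures, echoing the names above):

```csharp
// Chatty: forces the caller to await once per item.
Task<bool> GetNextBitAsync();

// Chunky: one await fetches a whole block; the caller
// then loops over it synchronously.
Task<byte[]> GetNextKilobyteAsync();
```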

    * If you have to have a chatty API, we saw how to cache the returned Task<T> to remove the one last allocation on the fast path.
    * And a cache size of just "1" is often the right choice!

    * But remember, don’t prematurely optimize.
    * Async is not a bottleneck if you’re only doing it a few hundred or thousand times a second.
    * It’s only when you get more that you’ll need to think about Async perf.
  • * Final tip for Async library developers is to consider task.ConfigureAwait(false)
  • * I need to get technical. Talk about "SynchronizationContext".
    It represents a target for work
    It’s been in the framework for a while, but we generally haven’t had to worry about it.
    * For example, in Winforms, if you get the current SynchronizationContext and do Post on it, it does a Control.BeginInvoke. That's how Winforms gets onto the UI thread.
    * And with the ASP.NET SynchronizationContext, when you do Post() on it, it schedules the work in its own way.
    * There are about 10 in the framework, and you can create more.
    And this is the key way that the await keyword knows how to put you back where you were.
    [CLICK]
    So when you do await, it first captures the current SyncContext before awaiting.
    [CLICK]
    * When it resumes, it uses SyncContext.Post() to resume "in the same place" as it was before
    [CLICK]
    * For app code, this is the behavior that you almost always want.
    * When you await a download, say, you want to come back and update the UI.
    * But when you're a library author, it's rarely needed.
    * Say you’ve got a library method with an await in the middle of it.
    * You usually don't care which threading context you come back on to finish up the second half of your library method.
    * It doesn't matter if your library method finishes off on a different thread either, maybe the IO completion port thread.
    * That's because when the user awaited on your Async library method, then their own await is going to put them back where they wanted. They don't need to rely on you doing it for them.
    [CLICK]
    * And so in the framework we provide this helper method, ConfigureAwait, on every Task.
    * Use it on your await operator.
    * Default, true, means the await should use SynchronizationContext.Post to resume back where it left off
    * If you pass in false, then if possible it'll skip that and just continue where it is, maybe the IO completion-port thread
    * Let's just stay there! It's as good a place as any!
    [CLICK]
    If your library doesn't do this, and you're using await in an inner loop, then you're wasting the user's message-loop
    The user’s message-loop is an app-global resource.
    * You’re being a bad citizen, flooding THEIR UI thread with messages that don't have anything to do with them



    * demo...
     
    [VS: 2.ConfigureAwait]
     
    const int ITERS = 20000;
    * Repeat inner loop 20,000 times
     
    await t
    * This one does the default - it does capture and resume on the captured synchronization context
     
    await t.ConfigureAwait(false)
    * This one, same code, just resumes on whichever thread it left off. Likely the threadpool thread.
     
    [RUN, CTRL+F5]
    * We see a fifteen-fold difference.
    * Each individual await only costs a few microseconds, which isn't much.
    * But if you're doing the loop 20,000 times, it adds up to half a second.
    * And it's completely irrelevant if you're only doing 10 or 100 awaits in your library method
    * But if you have an await inside your inner loop, or if your user will call you inside their inner loop, that's when it adds up
  • Again, this is a micro-optimization.
    If you only have a few tens of awaits per second, nothing to worry about. But otherwise…
    Principle is that the UI message-queue is an app-global resource.
    If the internal implementation of your library routine has awaits inside it, and your routine was called from the UI context, then it’ll wind up posting each of its awaits back to the UI thread.
    This is an abuse of the UI thread, which will hurt responsiveness.
    So you can use ConfigureAwait(false) to avoid that.
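A minimal sketch of that guidance, assuming a library helper built on HttpClient (the method name and what it does with the response are illustrative):

```csharp
using System.Net.Http;
using System.Threading.Tasks;

static class MyLibrary
{
    // Library code: we don't care which thread runs the second half,
    // so opt out of posting back to the caller's SynchronizationContext.
    public static async Task<int> DownloadLengthAsync(string url)
    {
        using (var client = new HttpClient())
        {
            string body = await client.GetStringAsync(url)
                                      .ConfigureAwait(false);
            // May resume on an IO-completion/threadpool thread: fine here.
            return body.Length;
        }
    }
}
```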
  • Async Best Practices

    1. 1. Lluis Franco & Alex Casquete .NET Conference 2015
    2. 2. Evolution of the async model Async void is only for top-level event handlers. Use the threadpool for CPU-bound code, but not IO-bound. Libraries shouldn't lie, and should be chunky. Micro-optimizations: Consider ConfigureAwait(false)
    3. 3. It seems you’re calling an async method without awaiting… Can I help you? Yep! Return a Task object plz Nope. Maybe later.
    4. 4. private async void Button1_Click(object Sender, EventArgs e) { Thread oThread = new Thread(new ThreadStart(myExpensiveMethod)); oThread.Start(); ... oThread.Abort(); ... if(oThread.IsAlive) { ... } } private static void myExpensiveMethod() { //Some expensive stuff here... //Read from Database/Internet //Perform some calculations salaryTextBox.Text = result; }
    5. 5. private async void Button1_Click(object Sender, EventArgs e) { //Thread oThread = new Thread(new ThreadStart(myExpensiveMethod)); //oThread.Start(); ThreadPool.QueueUserWorkItem(p => myExpensiveMethod()); } private static void myExpensiveMethod() { //Some expensive stuff here... //Read from Database/Internet //Perform some calculations if (salaryTextBox.InvokeRequired) salaryTextBox.Invoke(new Action(() => salaryTextBox.Text = result)); }
    6. 6. private int myExpensiveMethod() { ... return 42; } private void Button1_Click(object sender, EventArgs e) { var function = new Func<int>(myExpensiveMethod); IAsyncResult result = function.BeginInvoke(whenFinished, function); } private void whenFinished(IAsyncResult ar) { var function = ar.AsyncState as Func<int>; int result = function.EndInvoke(ar); resultTextBox.Text = string.Format("The answer is... {0}!", result); }
    7. 7. var numbers = Enumerable.Range(1, 10000000); var query = numbers.AsParallel().Where(n => n.IsPrime()); var primes = query.ToArray(); 1 2 3 4 5 6 7 8 OS Cores1 2 3 4 5 6 7 8 7 3 2 5 2 3 5 7
    8. 8. var customers = Customer.GetSampleCustomers(); Parallel.ForEach(customers, c => { if(!c.IsActive) c.Balance = 0; });
    9. 9. public static List<string> GetNetworkSQLServerInstances() { //Get local network servers return servers; } private void updateServersList(List<string> severs) { listBox1.Items.AddRange(servers.ToArray()); } //SYNC version private void Button1_Click(object sender, EventArgs e) { var servers = GetNetworkSQLServerInstances(); updateServersList(servers); }
    10. 10. public static List<string> GetNetworkSQLServerInstances() { //Get local network servers return servers; } private void updateServersList(List<string> severs) { listBox1.Items.AddRange(servers.ToArray()); } //ASYNC version private void Button1_Click(object sender, EventArgs e) { var serversTask = Task.Factory.StartNew(() => GetNetworkSQLServerInstances()); serversTask.ContinueWith(t => updateServersList(serversTask.Result)); }
    11. 11. public static List<string> GetNetworkSQLServerInstances() { //Get local network servers return servers; } private void updateServersList(List<string> severs) { listBox1.Items.AddRange(servers.ToArray()); } //ASYNC version + context synchronization private void Button1_Click(object sender, EventArgs e) { var serversTask = Task.Factory.StartNew(() => GetNetworkSQLServerInstances()); serversTask.ContinueWith(t => updateServersList(serversTask.Result), TaskScheduler.FromCurrentSynchronizationContext()); }
    12. 12. public static List<string> GetNetworkSQLServerInstances() { //Get local network servers return servers; } private void updateServersList(List<string> severs) { listBox1.Items.AddRange(servers.ToArray()); } //ASYNC/AWAIT version private async void Button1_Click(object sender, EventArgs e) { var servers = await Task.Run(() => GetNetworkSQLServerInstances()); updateServersList(servers); }
    13. 13. public static Task<List<string>> GetNetworkSQLServerInstancesAsync() { //Get local network servers return servers; } //ASYNC/AWAIT version 1 private async void Button1_Click(object sender, EventArgs e) { var servers = await Task.Run(() => GetNetworkSQLServerInstances()); updateServersList(servers); } //ASYNC/AWAIT version 2 (REAL async) private async void Button1_Click(object sender, EventArgs e) { var servers = await GetNetworkSQLServerInstancesAsync(); updateServersList(servers); }
    14. 14.     //ASYNC/AWAIT version private async void Button1_Click(object sender, EventArgs e) { var servers = await GetNetworkSQLServerInstancesAsync(); updateServersList(servers); }
    15. 15. public static void SimpleBody() { Console.WriteLine("Hello, Async World!"); } .method public hidebysig static void SimpleBody() cil managed { .maxstack 8 L_0000: ldstr "Hello, Async World!" L_0005: call void [mscorlib]System.Console::WriteLine(string) L_000a: ret }
    16. 16. public static async Task SimpleBody() { Console.WriteLine("Hello, Async World!"); } .method public hidebysig static class [mscorlib]System.Threading.Tasks.Task SimpleBody() cil managed { .custom instance void [mscorlib]System.Diagnostics.DebuggerStepThroughAttribute::.ctor() = ( 01 00 00 00 ) // Code size 32 (0x20) .maxstack 2 .locals init ([0] valuetype Program/'<SimpleBody>d__0' V_0) IL_0000: ldloca.s V_0 IL_0002: call valuetype [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder::Create() IL_0007: stfld valuetype [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder Program/'<SimpleBody>d__0'::'<>t__builder' IL_000c: ldloca.s V_0 IL_000e: call instance void Program/'<SimpleBody>d__0'::MoveNext() IL_0013: ldloca.s V_0 IL_0015: ldflda valuetype [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder Program/'<SimpleBody>d__0'::'<>t__builder' IL_001a: call instance class [mscorlib]System.Threading.Tasks.Task [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder::get_Task() IL_001f: ret } .method public hidebysig instance void MoveNext() cil managed { // Code size 66 (0x42) .maxstack 2 .locals init ([0] bool '<>t__doFinallyBodies', [1] class [mscorlib]System.Exception '<>t__ex') .try { IL_0000: ldc.i4.1 IL_0001: stloc.0 IL_0002: ldarg.0 IL_0003: ldfld int32 Program/'<SimpleBody>d__0'::'<>1__state' IL_0008: ldc.i4.m1 IL_0009: bne.un.s IL_000d IL_000b: leave.s IL_0041 IL_000d: ldstr "Hello, Async World!" 
IL_0012: call void [mscorlib]System.Console::WriteLine(string) IL_0017: leave.s IL_002f } catch [mscorlib]System.Exception { IL_0019: stloc.1 IL_001a: ldarg.0 IL_001b: ldc.i4.m1 IL_001c: stfld int32 Program/'<SimpleBody>d__0'::'<>1__state' IL_0021: ldarg.0 IL_0022: ldflda valuetype [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder Program/'<SimpleBody>d__0'::'<>t__builder' IL_0027: ldloc.1 IL_0028: call instance void [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder::SetException( class [mscorlib]System.Exception) IL_002d: leave.s IL_0041 } IL_002f: ldarg.0 IL_0030: ldc.i4.m1 IL_0031: stfld int32 Program/'<SimpleBody>d__0'::'<>1__state' IL_0036: ldarg.0 IL_0037: ldflda valuetype [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder Program/'<SimpleBody>d__0'::'<>t__builder' IL_003c: call instance void [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder::SetResult() IL_0041: ret }
    17. 17. private async void Button1_Click(object Sender, EventArgs e) { try { SendData("https://secure.flickr.com/services/oauth/request_token"); await Task.Delay(2000); DebugPrint("Received Data: " + m_GetResponse); } catch (Exception ex) { rootPage.NotifyUser("Error posting data to server." + ex.Message); } } private async void SendData(string Url) { var request = WebRequest.Create(Url); using (var response = await request.GetResponseAsync()) using (var stream = new StreamReader(response.GetResponseStream())) m_GetResponse = stream.ReadToEnd(); }
    18. 18. private async void Button1_Click(object Sender, EventArgs e) { try { SendData("https://secure.flickr.com/services/oauth/request_token"); // await Task.Delay(2000); // DebugPrint("Received Data: " + m_GetResponse); } catch (Exception ex) { rootPage.NotifyUser("Error posting data to server." + ex.Message); } } private async void SendData(string Url) { var request = WebRequest.Create(Url); using (var response = await request.GetResponseAsync()) // exception on resumption using (var stream = new StreamReader(response.GetResponseStream())) m_GetResponse = stream.ReadToEnd(); }
    19. 19. private async void Button1_Click(object Sender, EventArgs e) { try { SendData("https://secure.flickr.com/services/oauth/request_token"); await Task.Delay(2000); DebugPrint("Received Data: " + m_GetResponse); } catch (Exception ex) { rootPage.NotifyUser("Error posting data to server." + ex.Message); } } private async void SendData(string Url) { var request = WebRequest.Create(Url); using (var response = await request.GetResponseAsync()) using (var stream = new StreamReader(response.GetResponseStream())) m_GetResponse = stream.ReadToEnd(); } Task Async Async await
    20. 20. // Q. It sometimes shows PixelWidth and PixelHeight are both 0 ??? BitmapImage m_bmp; protected override async void OnNavigatedTo(NavigationEventArgs e) { base.OnNavigatedTo(e); await PlayIntroSoundAsync(); image1.Source = m_bmp; Canvas.SetLeft(image1, Window.Current.Bounds.Width - m_bmp.PixelWidth); } protected override async void LoadState(Object nav, Dictionary<String, Object> pageState) { m_bmp = new BitmapImage(); var file = await StorageFile.GetFileFromApplicationUriAsync("ms-appx:///pic.png"); using (var stream = await file.OpenReadAsync()) { await m_bmp.SetSourceAsync(stream); } } class LayoutAwarePage : Page { private string _pageKey; protected override void OnNavigatedTo(NavigationEventArgs e) { if (this._pageKey != null) return; this._pageKey = "Page-" + this.Frame.BackStackDepth; ... this.LoadState(e.Parameter, null); } }
    21. 21. // A. Use a task Task<BitmapImage> m_bmpTask; protected override async void OnNavigatedTo(NavigationEventArgs e) { base.OnNavigatedTo(e); await PlayIntroSoundAsync(); var bmp = await m_bmpTask; image1.Source = bmp; Canvas.SetLeft(image1, Window.Current.Bounds.Width - bmp.PixelWidth); } protected override void LoadState(Object nav, Dictionary<String, Object> pageState) { m_bmpTask = LoadBitmapAsync(); } private async Task<BitmapImage> LoadBitmapAsync() { var bmp = new BitmapImage(); ... return bmp; }
    22. 22. ' In VB, the expression itself determines void- or Task-returning (not the context). Dim void_returning = Async Sub() Await LoadAsync() : m_Result = "done" End Sub Dim task_returning = Async Function() Await LoadAsync() : m_Result = "done" End Function ' If both overloads are offered, you must give it Task-returning. Await Task.Run(Async Function() ... End Function) // In C#, the context determines whether async lambda is void- or Task-returning. Action a1 = async () => { await LoadAsync(); m_Result="done"; }; Func<Task> a2 = async () => { await LoadAsync(); m_Result="done"; }; // Q. Which one will it pick? await Task.Run( async () => { await LoadAsync(); m_Result="done"; }); // A. If both overloads are offered, it will pick Task-returning. Good! class Task { static public Task Run(Action a) {...} static public Task Run(Func<Task> a) {...} ... }
    23. 23. // table1.DataSource = LoadHousesSequentially(1,5); // table1.DataBind(); public List<House> LoadHousesSequentially(int first, int last) { var loadedHouses = new List<House>(); for (int i = first; i <= last; i++) { House house = House.Deserialize(i); loadedHouses.Add(house); } return loadedHouses; } work1 work2 work3 work4 work5
    24. 24. // table1.DataSource = LoadHousesInParallel(1,5); // table1.DataBind(); public List<House> LoadHousesInParallel(int first, int last) { var loadedHouses = new BlockingCollection<House>(); Parallel.For(first, last+1, i => { House house = House.Deserialize(i); loadedHouses.Add(house); }); return loadedHouses.ToList(); } 3 response out 300ms work1 work2 work3 work4 work5 Parallel.For Parallelization hurts Scalability!
    25. 25. end1 start1 end2 start2 end3 start3 end4 start4 end5 start5
    26. 26. 3end3 start3 end1 start1 end2 start2 end5 start5 response out ~200ms Parallel.For end3 start3 end4 start4
    27. 27. end2 start1 start2 start3 start4 start5 response out ~100ms end5 end1 end3 end4
    28. 28. // table1.DataSource = await LoadHousesAsync(1,5); // table1.DataBind(); public async Task<List<House>> LoadHousesAsync(int first, int last) { var tasks = new List<Task<House>>(); for (int i = first; i <= last; i++) { Task<House> t = House.LoadFromDatabaseAsync(i); tasks.Add(t); } House[] loadedHouses = await Task.WhenAll(tasks); return loadedHouses.ToList(); } When… methods minimize awaits + exceptions
    29. 29. public async void btnPayout_Click(object sender, RoutedEventArgs e) { double initialPrice, strikePrice, drift, volatility = from UI double[] prices = new double[252]; double total_payout = 0; for (int i = 0; i < 1000000; i++) { Quant.SimulateStockPrice(prices, initialPrice, drift, volatility); total_payout += Quant.Payout_AsianCallOption(prices, strikePrice); } txtExpectedPayout.Text = (total_payout / 1000000).ToString(); } //Box-Muller technique, generates "standard normal" distribution (mean=0, variance=1) let private NextNormal () = let u1 = RND.NextDouble() let u2 = RND.NextDouble() sqrt(-2.0 * log u1) * sin(2.0 * System.Math.PI * u2) //Geometric Brownian Motion, a common technique to model stock price let SimulateStockPrice (prices:double[], initialPrice, drift, volatility) = let dt = 1.0 / float prices.Length let rec sim i value = prices.[i] <- value let nextval = value * (1.0 + drift*dt + volatility*NextNormal()*sqrt dt) if i+1 < prices.Length then sim (i+1) (if nextval < 0.0 then 0.0 else nextval) sim 0 initialPrice //An Asian Call Option gives payout if strike price is lower than the average stock price let Payout_AsianCallOption (prices, strikePrice) = let av = Array.average prices max (av - strikePrice) 0.0
    30. 30. public async void btnPayout_Click(object sender, RoutedEventArgs e) { double initialPrice, strikePrice, drift, volatility = from UI var expectedPayout = await Task.Run(() => { double[] prices = new double[252]; double total_payout = 0; for (int i = 0; i < 1000000; i++) { Quant.SimulateStockPrice(prices, initialPrice, drift, volatility); total_payout += Quant.Payout_AsianCallOption(prices, strikePrice); } return total_payout / 1000000; }); txtExpectedPayout.Text = expectedPayout.ToString(); }
    31. 31. public async void btnPayout_Click(object sender, RoutedEventArgs e) { double initialPrice, strikePrice, drift, volatility = from UI IProgress<int> progress = new Progress<int>(i => progressBar1.Value = i); var expectedPayout = await Task.Run(() => { double[] prices = new double[252]; double total_payout = 0; for (int i = 0; i < 1000000; i++) { Quant.SimulateStockPrice(prices, initialPrice, drift, volatility); total_payout += Quant.Payout_AsianCallOption(prices, strikePrice); if(i % 1000 == 0) progress.Report(i); } return total_payout / 1000000; }); txtExpectedPayout.Text = expectedPayout.ToString(); }
    32. 32. Foo(); var task = FooAsync(); ... await task; synchronous perform when it’s done asynchronous initiate immediately
    33. 33. public static void PausePrint2() { Task t = PausePrintAsync(); t.Wait(); } // “I’m not allowed an async signature, // but my underlying library is async” public static Task PausePrint2Async() { return Task.Run(() => PausePrint()); } // “I want to offer an async signature, // but my underlying library is synchronous” public static Task PausePrintAsync() { var tcs = new TaskCompletionSource<bool>(); new Timer(_ => { Console.WriteLine("Hello"); tcs.SetResult(true); }).Change(10000, Timeout.Infinite); return tcs.Task; } public static async Task PausePrintAsync() { await Task.Delay(10000); Console.WriteLine("Hello"); } Synchronous Asynchronous public static void PausePrint() { var end = DateTime.Now + TimeSpan.FromSeconds(10); while (DateTime.Now < end) { } Console.WriteLine("Hello"); } “Should I expose async wrappers for synchronous methods?” – generally no! http://blogs.msdn.com/b/pfxteam/archive/2012/03/24/10287244.aspx “How can I expose sync wrappers for async methods?” – if you absolutely have to, you can use a nested message-loop… http://blogs.msdn.com/b/pfxteam/archive/2012/04/13/10293638.aspx
    34. 34. The threadpool is an app-global resource In a server app, spinning up threads hurts scalability The app is in the best position to manage its threads synchronous blocks the current thread asynchronous without spawning new threads
    35. 35. async Task LoadAsync() { await IO.Network.DownloadAsync(path); } void Button1_Click(){ var t = LoadAsync(); t.Wait(); UpdateView(); } Click Messagepump Task ... DownloadAsync Task ... LoadAsync Download
    36. 36. server scalability.
    37. 37. We all know sync methods are “cheap” public static void SimpleBody() { Console.WriteLine("Hello, Async World!"); } .method public hidebysig static void SimpleBody() cil managed { .maxstack 8 L_0000: ldstr "Hello, Async World!" L_0005: call void [mscorlib]System.Console::WriteLine(string) L_000a: ret }
38. Not so for asynchronous methods:

public static async Task SimpleBody() {
    Console.WriteLine("Hello, Async World!");
}

.method public hidebysig static class [mscorlib]System.Threading.Tasks.Task SimpleBody() cil managed
{
    .custom instance void [mscorlib]System.Diagnostics.DebuggerStepThroughAttribute::.ctor() = ( 01 00 00 00 )
    // Code size 32 (0x20)
    .maxstack 2
    .locals init ([0] valuetype Program/'<SimpleBody>d__0' V_0)
    IL_0000: ldloca.s V_0
    IL_0002: call valuetype [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder::Create()
    IL_0007: stfld valuetype [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder Program/'<SimpleBody>d__0'::'<>t__builder'
    IL_000c: ldloca.s V_0
    IL_000e: call instance void Program/'<SimpleBody>d__0'::MoveNext()
    IL_0013: ldloca.s V_0
    IL_0015: ldflda valuetype [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder Program/'<SimpleBody>d__0'::'<>t__builder'
    IL_001a: call instance class [mscorlib]System.Threading.Tasks.Task [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder::get_Task()
    IL_001f: ret
}

.method public hidebysig instance void MoveNext() cil managed
{
    // Code size 66 (0x42)
    .maxstack 2
    .locals init ([0] bool '<>t__doFinallyBodies', [1] class [mscorlib]System.Exception '<>t__ex')
    .try
    {
        IL_0000: ldc.i4.1
        IL_0001: stloc.0
        IL_0002: ldarg.0
        IL_0003: ldfld int32 Program/'<SimpleBody>d__0'::'<>1__state'
        IL_0008: ldc.i4.m1
        IL_0009: bne.un.s IL_000d
        IL_000b: leave.s IL_0041
        IL_000d: ldstr "Hello, Async World!"
        IL_0012: call void [mscorlib]System.Console::WriteLine(string)
        IL_0017: leave.s IL_002f
    }
    catch [mscorlib]System.Exception
    {
        IL_0019: stloc.1
        IL_001a: ldarg.0
        IL_001b: ldc.i4.m1
        IL_001c: stfld int32 Program/'<SimpleBody>d__0'::'<>1__state'
        IL_0021: ldarg.0
        IL_0022: ldflda valuetype [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder Program/'<SimpleBody>d__0'::'<>t__builder'
        IL_0027: ldloc.1
        IL_0028: call instance void [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder::SetException(class [mscorlib]System.Exception)
        IL_002d: leave.s IL_0041
    }
    IL_002f: ldarg.0
    IL_0030: ldc.i4.m1
    IL_0031: stfld int32 Program/'<SimpleBody>d__0'::'<>1__state'
    IL_0036: ldarg.0
    IL_0037: ldflda valuetype [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder Program/'<SimpleBody>d__0'::'<>t__builder'
    IL_003c: call instance void [mscorlib]System.Runtime.CompilerServices.AsyncTaskMethodBuilder::SetResult()
    IL_0041: ret
}
39. public static async Task<int> GetNextIntAsync() {
    if (m_Count == m_Buf.Length) {
        m_Buf = await FetchNextBufferAsync();
        m_Count = 0;
    }
    m_Count += 1;
    return m_Buf[m_Count - 1];
}
40. var x = await GetNextIntAsync();

The compiler expands the await into (pseudocode):

var $awaiter = GetNextIntAsync().GetAwaiter();
if (!$awaiter.IsCompleted) {
    // do the await/return and resume here
}
var x = $awaiter.GetResult();
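The expansion on this slide can be tried by hand with the real awaiter API. This is a sketch with an assumed `GetNextIntAsync` that completes synchronously, so it walks only the fast path; the slow-path branch is described in comments:

```csharp
using System;
using System.Threading.Tasks;

class AwaiterDemo
{
    // Assumed stand-in: an already-completed task, so IsCompleted is true.
    static Task<int> GetNextIntAsync() => Task.FromResult(42);

    static void Main()
    {
        // Hand-expanded form of "var x = await GetNextIntAsync();"
        var awaiter = GetNextIntAsync().GetAwaiter();
        if (!awaiter.IsCompleted)
        {
            // Slow path: the compiler registers the rest of the method as a
            // continuation via awaiter.OnCompleted(...) and returns to the caller.
        }
        var x = awaiter.GetResult(); // fast path: no continuation, no suspension
        Console.WriteLine(x);
    }
}
```

The fast path matters for performance: when the task is already complete, the generated code never suspends and never schedules a continuation.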
41. public static async Task<int> GetNextIntAsync() {
    if (m_Count == m_Buf.Length) {
        m_Buf = await FetchNextBufferAsync();
        m_Count = 0;
    }
    m_Count += 1;
    return m_Buf[m_Count - 1];
}
42. The heap is an app-global resource. Like all heap allocations, the allocations behind async methods (the Task object and, on suspension, the boxed state machine) can add GC pressure and hurt performance.
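One common way to reduce those allocations (an assumed example, not from the slides) is to cache completed `Task<T>` instances for frequently returned values, so the synchronous fast path returns without allocating:

```csharp
using System;
using System.Threading.Tasks;

static class CachedTasks
{
    // Task.FromResult allocates once per value; reusing the instances means
    // the common fast path hands back a task without touching the heap.
    static readonly Task<bool> True = Task.FromResult(true);
    static readonly Task<bool> False = Task.FromResult(false);

    public static Task<bool> IsEvenAsync(int n) => (n % 2 == 0) ? True : False;
}

class Program
{
    static void Main()
    {
        // The same cached instance is returned on every call.
        Console.WriteLine(ReferenceEquals(CachedTasks.IsEvenAsync(2),
                                          CachedTasks.IsEvenAsync(4)));
    }
}
```

The framework itself uses this trick internally, e.g. caching tasks for `true`/`false` and small integers.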
43. Sync context represents a "target for work."
"await task" resumes on the sync context "where you were before."
But for library code this is rarely needed: use "await task.ConfigureAwait(false)".
This suppresses the context capture; where possible, the method resumes "on the thread that completed the task."
Result: slightly better performance, and it can avoid deadlocks when a badly-written caller blocks on the task.
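A short sketch of the library-code guideline above. `ReadAllTextAsync` is a hypothetical helper, not from the slides; the point is the `ConfigureAwait(false)` on the awaited task:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

static class FileHelpers
{
    public static async Task<string> ReadAllTextAsync(string path)
    {
        using (var reader = new StreamReader(path))
        {
            // Library code: suppress sync-context capture, so the
            // continuation resumes on the thread that completed the I/O
            // rather than being posted back to the caller's UI context.
            return await reader.ReadToEndAsync().ConfigureAwait(false);
        }
    }
}

class Program
{
    static async Task Main()
    {
        var path = Path.GetTempFileName();
        File.WriteAllText(path, "hello");
        Console.WriteLine(await FileHelpers.ReadAllTextAsync(path));
        File.Delete(path);
    }
}
```

In application code (event handlers, view models) you normally omit `ConfigureAwait(false)` because you want to come back to the UI thread; in reusable libraries you normally apply it to every await.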
    44. 44. UI responsiveness
45. Lluis Franco & Alex Casquete, Async best practices. If you liked the session, don't forget to fill in the survey! Thanks.
