

I'm running a script that logs into an authenticated session on a website and clicks a button to download an Excel file. I'm able to run it with no problems while headless: false, but when headless: true, the file does not download. I've added a wait of about 15 seconds, which is much longer than it should need to download the file, but I'm still not getting anything. My research suggests that the browser is possibly closing before the download completes. Another solution I tried was manually removing the HeadlessChrome substring from the userAgent in case the site was blocking it, but that didn't work either.
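
A user-agent override of that kind in Puppeteer Sharp looks roughly like the sketch below; the launch options and the replacement string are assumptions, not the original code.

    // Sketch (assumed Puppeteer Sharp API): report a regular Chrome user agent
    // so sites that sniff for "HeadlessChrome" don't treat the session differently.
    await using var browser = await Puppeteer.LaunchAsync(new LaunchOptions { Headless = true });
    await using var page = await browser.NewPageAsync();

    var userAgent = await browser.GetUserAgentAsync();
    await page.SetUserAgentAsync(userAgent.Replace("HeadlessChrome", "Chrome"));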

I had a more difficult variation of this, using Puppeteer Sharp. I needed both headers and cookies set before the download would start. In essence, before the button click, I had to process multiple responses and handle the single response that contained the download. Once I had that particular response, I had to attach headers and cookies for the remote server to send the downloadable data in the response.

Handle multiple responses and process the download:

    await using (var browser = await Puppeteer.LaunchAsync(new LaunchOptions { Headless = true }))
    await using (var page = await browser.NewPageAsync())
    {
        // Handle the response with the Excel download
        page.Response += async (sender, responseCreatedEventArgs) =>
        {
            // Identify the single response that carries the Excel download, then
            // add the cookies to a container for the upcoming download GET request
            var pageCookies = await page.GetCookiesAsync();
            var cookieContainer = BuildCookieContainer(pageCookies);

            // getUrl, fullPath and cancellationToken come from the surrounding code
            await DownloadFileRequiringHeadersAndCookies(getUrl, fullPath, cookieContainer, cancellationToken);
        };

        await page.ClickAsync("button");

        // NEED THIS TIMEOUT TO KEEP THE BROWSER OPEN WHILE THE FILE IS DOWNLOADING!
        await page.WaitForTimeoutAsync(1000 * configs.DownloadDurationEstimateInSeconds);
    }

Populate the Cookie Container like this:

    private CookieContainer BuildCookieContainer(IEnumerable<CookieParam> cookies)
    {
        var cookieContainer = new CookieContainer();
        foreach (var cookie in cookies)
        {
            cookieContainer.Add(new Cookie(cookie.Name, cookie.Value, cookie.Path, cookie.Domain));
        }
        return cookieContainer;
    }
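
The body of DownloadFileRequiringHeadersAndCookies isn't shown above; a minimal sketch of such a helper, assuming a plain HttpClient whose handler carries the cookie container and a placeholder User-Agent header, could look like this:

    // Sketch only: uses System.Net, System.Net.Http, System.IO, System.Threading, System.Threading.Tasks.
    // The header names/values are placeholders; send whatever the site actually requires.
    private static async Task DownloadFileRequiringHeadersAndCookies(
        string getUrl, string fullPath, CookieContainer cookieContainer, CancellationToken cancellationToken)
    {
        var handler = new HttpClientHandler { CookieContainer = cookieContainer };
        using var httpClient = new HttpClient(handler);

        using var request = new HttpRequestMessage(HttpMethod.Get, getUrl);
        request.Headers.TryAddWithoutValidation("User-Agent", "Mozilla/5.0"); // placeholder header

        using var response = await httpClient.SendAsync(request, cancellationToken);
        response.EnsureSuccessStatusCode();

        // Stream the Excel payload straight to disk.
        await using var fileStream = File.Create(fullPath);
        await response.Content.CopyToAsync(fileStream);
    }

Because the handler owns the cookie container, the GET request goes out with the same authenticated session cookies the page collected, which is what lets the server return the file.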
