Fixed a race condition in the libcurl cleanup that led to a memory leak.

Leon Styhre 2020-10-18 11:41:36 +02:00
parent 5838481e0d
commit 82759fb2ce


@@ -127,6 +127,22 @@ GuiScraperSearch::GuiScraperSearch(
GuiScraperSearch::~GuiScraperSearch()
{
    // The following manual resets are required to avoid a race condition when the
    // STOP button is pressed in the multi-scraper. Without this code there will be
    // a memory leak, as the cURL easy handle is never cleaned up. For a normally
    // completed scraping run, however, the destructor will already have been called in HttpReq.
if (mSearchHandle)
mSearchHandle.reset();
if (mMDRetrieveURLsHandle)
mMDRetrieveURLsHandle.reset();
if (mMDResolveHandle)
mMDResolveHandle.reset();
if (mThumbnailReq)
mThumbnailReq.reset();
HttpReq::cleanupCurlMulti();
}
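
For context, the ordering matters because each request object owns a cURL easy handle that is registered with a shared multi handle, and the easy handle can only be detached and cleaned up while that multi handle is still alive. Below is a minimal, self-contained sketch of that ordering; the SimpleHttpReq class and the sMultiHandle variable are illustrative assumptions and not the actual HttpReq implementation, they only show why the request objects are reset before HttpReq::cleanupCurlMulti() tears down the shared multi handle.

#include <curl/curl.h>
#include <memory>

// Illustrative stand-in for the shared multi handle managed by HttpReq.
static CURLM* sMultiHandle = nullptr;

class SimpleHttpReq
{
public:
    SimpleHttpReq()
    {
        if (!sMultiHandle)
            sMultiHandle = curl_multi_init();
        mHandle = curl_easy_init();
        curl_multi_add_handle(sMultiHandle, mHandle);
    }
    ~SimpleHttpReq()
    {
        // The easy handle must be detached and cleaned up while the multi
        // handle still exists; if the multi handle is destroyed first, the
        // easy handle is simply leaked.
        curl_multi_remove_handle(sMultiHandle, mHandle);
        curl_easy_cleanup(mHandle);
    }
    // Illustrative equivalent of HttpReq::cleanupCurlMulti().
    static void cleanupCurlMulti()
    {
        if (sMultiHandle) {
            curl_multi_cleanup(sMultiHandle);
            sMultiHandle = nullptr;
        }
    }
private:
    CURL* mHandle;
};

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    {
        // Mirrors the destructor above: destroy any outstanding request
        // objects first, then tear down the shared multi handle.
        auto req = std::make_unique<SimpleHttpReq>();
        req.reset();
        SimpleHttpReq::cleanupCurlMulti();
    }
    curl_global_cleanup();
    return 0;
}

In the destructor above, explicitly resetting mSearchHandle, mMDRetrieveURLsHandle, mMDResolveHandle and mThumbnailReq before calling HttpReq::cleanupCurlMulti() guarantees this order even when a scraping run is aborted mid-request with the STOP button.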