Mirror of https://github.com/RetroDECK/ES-DE.git, synced 2025-02-16 12:05:38 +00:00
Fixed a race condition for the libcurl cleanup which led to a memory leak.
parent 5838481e0d
commit 82759fb2ce
@@ -127,6 +127,22 @@ GuiScraperSearch::GuiScraperSearch(

GuiScraperSearch::~GuiScraperSearch()
{
    // The following manual resets are required to avoid a race condition when the
    // STOP button is pressed in the multi-scraper. Without this code there will be
    // a memory leak as the cURL easy handle is not cleaned up. For a normally completed
    // scraping however, the destructor will already have been called in HttpReq.
    if (mSearchHandle)
        mSearchHandle.reset();

    if (mMDRetrieveURLsHandle)
        mMDRetrieveURLsHandle.reset();

    if (mMDResolveHandle)
        mMDResolveHandle.reset();

    if (mThumbnailReq)
        mThumbnailReq.reset();

    HttpReq::cleanupCurlMulti();
}
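For context, the ordering the destructor enforces follows from the libcurl multi interface: every easy handle has to be removed from the multi handle and cleaned up before curl_multi_cleanup() runs, otherwise the easy handle leaks. Below is a minimal standalone sketch of that ordering; it is not the ES-DE HttpReq code, and everything apart from the libcurl calls themselves is illustrative.

    // Minimal sketch of the libcurl multi-interface cleanup order (illustrative,
    // not the actual ES-DE HttpReq implementation).
    #include <curl/curl.h>

    int main()
    {
        curl_global_init(CURL_GLOBAL_DEFAULT);

        CURLM* multiHandle {curl_multi_init()};
        CURL* easyHandle {curl_easy_init()};

        curl_easy_setopt(easyHandle, CURLOPT_URL, "https://example.com");
        curl_multi_add_handle(multiHandle, easyHandle);

        // ... transfers would be driven here with curl_multi_perform() ...

        // Remove and clean up the easy handle first; this is what resetting the
        // request-owning smart pointers in the destructor above achieves.
        curl_multi_remove_handle(multiHandle, easyHandle);
        curl_easy_cleanup(easyHandle);

        // Only then tear down the multi handle, the step that corresponds to
        // HttpReq::cleanupCurlMulti() in the commit.
        curl_multi_cleanup(multiHandle);
        curl_global_cleanup();

        return 0;
    }

Tearing down the multi handle while easy handles are still outstanding leaves those handles uncleaned, which is consistent with the leak the commit message describes when the STOP button interrupts a scrape.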