WIP. More Efficient MemoryAllocator #1660

Closed · wants to merge 30 commits

Commits (30)
8da4092
Add GC handling, use shared, simplify pool
JimBobSquarePants Jun 12, 2021
1a0ce6f
Only assign reference if using the large pool
JimBobSquarePants Jun 12, 2021
407092b
Fix null ref
JimBobSquarePants Jun 12, 2021
ad96db7
Introduce a timer that performs a callback to cleanup
JimBobSquarePants Jun 12, 2021
6306567
Pass allocator directly
JimBobSquarePants Jun 13, 2021
a9ee629
Update ArrayPoolMemoryAllocator.Buffer{T}.cs
JimBobSquarePants Jun 13, 2021
6036a39
Use GC Aware configurable pool.
JimBobSquarePants Jun 13, 2021
c860447
Simplify and use 2MB buffers
JimBobSquarePants Jun 13, 2021
e9105d3
Fix iterator tests
JimBobSquarePants Jun 13, 2021
4feb673
Add unmanaged buffer implementation
JimBobSquarePants Jun 14, 2021
0de66a4
Cleanup and fix warnings
JimBobSquarePants Jun 14, 2021
f9395fd
Fix everything but Tiff
JimBobSquarePants Jun 14, 2021
5e51795
Merge branch 'master' into js/memory-experiments
JimBobSquarePants Jun 14, 2021
801e887
Merge branch 'master' into js/memory-experiments
JimBobSquarePants Jun 14, 2021
06f9557
Merge branch 'master' into js/memory-experiments
JimBobSquarePants Jun 14, 2021
0aa3211
Better naming, fix last failing test
JimBobSquarePants Jun 14, 2021
73bd60a
Remove factory methods
JimBobSquarePants Jun 14, 2021
f77923a
Merge branch 'master' into js/memory-experiments
JimBobSquarePants Jun 15, 2021
ce668dd
Clean up utilities and namespaces.
JimBobSquarePants Jun 15, 2021
65d2a28
Remove IManagedByteBuffer
JimBobSquarePants Jun 15, 2021
88adace
Fix all but 2 bitmap decode tests
JimBobSquarePants Jun 15, 2021
9c51119
Fix bmp tests, re-enable dither tests
JimBobSquarePants Jun 15, 2021
4c4f9d3
Merge branch 'master' into js/memory-experiments
JimBobSquarePants Jun 16, 2021
0e01d1b
Skip failing test on non windows
JimBobSquarePants Jun 16, 2021
b18c871
Skip another assertation on UNIX
JimBobSquarePants Jun 16, 2021
e9e1948
Update ImageSharp.sln
JimBobSquarePants Jun 16, 2021
65c67d3
Smarter pixel map reuse.
JimBobSquarePants Jun 16, 2021
a22b0cb
Skip flaky test
JimBobSquarePants Jun 16, 2021
f01bc0a
Fix first ticks.
JimBobSquarePants Jun 17, 2021
547614a
Skip flaky test
JimBobSquarePants Jun 17, 2021
10 changes: 10 additions & 0 deletions src/ImageSharp/Memory/Allocators/ArrayPoolMemoryAllocator.cs
@@ -4,6 +4,7 @@
using System.Buffers;
using System.Runtime.CompilerServices;
using SixLabors.ImageSharp.Memory.Allocators.Internals;
using SixLabors.ImageSharp.Memory.Internals;

namespace SixLabors.ImageSharp.Memory
{
@@ -116,6 +117,15 @@ public override IMemoryOwner<T> Allocate<T>(int length, AllocationOptions option

int itemSizeBytes = Unsafe.SizeOf<T>();
int bufferSizeInBytes = length * itemSizeBytes;

// For anything greater than our pool limit defer to unmanaged memory
// to prevent LOH fragmentation.
if (bufferSizeInBytes > this.MaxPoolSizeInBytes)
{
return new UnmanagedBuffer<T>(bufferSizeInBytes);
}

// Safe to pool.
ArrayPool<byte> pool = this.GetArrayPool(bufferSizeInBytes);
byte[] byteArray = pool.Rent(bufferSizeInBytes);

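For context on how the branch above plays out for callers, here is a minimal usage sketch. It assumes a public parameterless ArrayPoolMemoryAllocator constructor and request sizes on either side of MaxPoolSizeInBytes; the exact defaults and construction API in this PR may differ.

// Illustrative sketch only (assumed API shape, not code from this PR).
using System;
using System.Buffers;
using SixLabors.ImageSharp.Memory;

internal static class AllocationDemo
{
    private static void Main()
    {
        var allocator = new ArrayPoolMemoryAllocator();

        // Small request: the byte size is under MaxPoolSizeInBytes,
        // so the buffer is rented from the managed ArrayPool as before.
        using (IMemoryOwner<byte> small = allocator.Allocate<byte>(64 * 1024))
        {
            Console.WriteLine($"small: {small.Memory.Length} bytes, pooled");
        }

        // Large request: the byte size exceeds the pool limit, so the
        // allocator hands back a buffer backed by unmanaged memory and
        // nothing lands on the Large Object Heap.
        using (IMemoryOwner<byte> large = allocator.Allocate<byte>(32 * 1024 * 1024))
        {
            Console.WriteLine($"large: {large.Memory.Length} bytes, unmanaged");
        }
    }
}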
94 changes: 94 additions & 0 deletions src/ImageSharp/Memory/Allocators/Internals/UnmanagedBuffer{T}.cs
@@ -0,0 +1,94 @@
// Copyright (c) Six Labors.
// Licensed under the Apache License, Version 2.0.

using System;
using System.Buffers;
using System.Runtime.CompilerServices;
using System.Runtime.InteropServices;

namespace SixLabors.ImageSharp.Memory.Internals
{
/// <summary>
/// Allocates and provides an <see cref="IMemoryOwner{T}"/> implementation giving
/// access to unmanaged buffers.
/// </summary>
/// <typeparam name="T">The element type.</typeparam>
internal sealed class UnmanagedBuffer<T> : MemoryManager<T>
where T : struct
{
private bool isDisposed;
private readonly SafeHandle safeHandle;
private readonly int length;

/// <summary>
/// Initializes a new instance of the <see cref="UnmanagedBuffer{T}"/> class.
/// </summary>
/// <param name="byteCount">The number of bytes to allocate.</param>
public UnmanagedBuffer(int byteCount)
{
this.length = byteCount / Unsafe.SizeOf<T>();
this.safeHandle = new SafeHGlobalHandle(byteCount);
}

public override unsafe Span<T> GetSpan()
=> new Span<T>((void*)this.safeHandle.DangerousGetHandle(), this.length);

/// <inheritdoc />
public override unsafe MemoryHandle Pin(int elementIndex = 0)
{
// Will be released in Unpin
bool unused = false;
this.safeHandle.DangerousAddRef(ref unused);

void* pbData = Unsafe.Add<T>((void*)this.safeHandle.DangerousGetHandle(), elementIndex);
return new MemoryHandle(pbData);
}

/// <inheritdoc />
public override void Unpin() => this.safeHandle.DangerousRelease();

/// <inheritdoc />
protected override void Dispose(bool disposing)
{
if (this.isDisposed || this.safeHandle.IsInvalid)
{
return;
}

if (disposing)
{
this.safeHandle.Dispose();
}

this.isDisposed = true;
}

private sealed class SafeHGlobalHandle : SafeHandle
{
private readonly int byteCount;

public SafeHGlobalHandle(int size)
: base(IntPtr.Zero, true)
{
this.SetHandle(Marshal.AllocHGlobal(size));
GC.AddMemoryPressure(this.byteCount = size);

Contributor @br3aker commented on Jun 14, 2021:

Should this be done? This would lead to an earlier gen2 collection, and the object itself would behave as if it were a managed allocation, which I assume is not what we want for large chunks of data.

The Microsoft docs state that "The AddMemoryPressure and RemoveMemoryPressure methods improve performance only for types that exclusively depend on finalizers to release the unmanaged resources", which is not the case here: the safe handle is only used to prevent the memory from leaking when a user forgets to call Dispose, which is very unlikely to be caused by library code (this should be tested somehow, to be honest).

Otherwise this would lead to memory throttling, as it does with the current implementation.

Contributor reply:

That advice is awkwardly worded. In the case where a user fails to Dispose an object backed by an unmanaged buffer, it does rely on the SafeHandle's finalizer to release the memory, so it is advantageous to advise the GC that it might be able to reclaim the memory by doing its GC thing. This should only result in more frequent gen2 GCs if the memory limits are being reached, which should only happen if the unmanaged buffer is long-lived or if the system is actually low on memory. For properly Disposed ephemeral buffers, the GC will be aware of the added memory pressure but will also see it freed quickly, so no harm.

Contributor reply:

Not sure these buffers can be called 'ephemeral'; 2 MB is a lot (at least it looks like a lot for something temporary), which should mean a lot of execution time is spent on this memory. The primary idea I had while writing this is that even if the user forgets to dispose memory backed by a handle, it will be freed either by a gen0/gen1 collection or by a full gen2 collection when an OutOfMemoryException kicks in.

While I'm not so sure about this now, I'm still concerned that this might be overkill due to the freachable queue. At the very least I'd want to benchmark this on something big like your parallel beeheads demo.

Contributor reply:

Unmanaged buffers will only be used for allocations that are over the pooled size limit. They should be used once and then released, hence 'ephemeral'. The question of whether GC.AddMemoryPressure is appropriate ultimately comes down to "is there any way a GC could free this memory?". The answer here is yes, but only if the memory has been leaked because someone didn't dispose. Unfortunately there's no way to know whether someone is going to leak; you can only tell after they've done it, when your finalizer runs.

What the MS docs should have said is something along the lines of "don't tell the GC about memory it couldn't possibly reclaim". If the allocation and free were always 100% deterministic, there would be no point in telling the GC about them. But since we can't know whether the GC could reclaim the memory, it's better to tell it about the allocation than to not.

Contributor reply:

Once again, thanks for the deep dive! I've never looked at the GC from the "memory it can or cannot reclaim" point of view.

}

public override bool IsInvalid => this.handle == IntPtr.Zero;

protected override bool ReleaseHandle()
{
if (this.IsInvalid)
{
return false;
}

Marshal.FreeHGlobal(this.handle);
GC.RemoveMemoryPressure(this.byteCount);
this.handle = IntPtr.Zero;

return true;
}
}
}
}
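
The review thread above boils down to a small, self-contained pattern. The following standalone sketch (assumed code, not the PR's exact classes; NativeBlockHandle and LeakDemo are hypothetical names) shows how GC.AddMemoryPressure and GC.RemoveMemoryPressure pair with a SafeHandle: a properly disposed buffer removes its pressure almost immediately, while a leaked one is still reclaimed by the finalizer that the added pressure encourages the GC to run.

using System;
using System.Runtime.InteropServices;

internal sealed class NativeBlockHandle : SafeHandle
{
    private readonly int byteCount;

    public NativeBlockHandle(int byteCount)
        : base(IntPtr.Zero, ownsHandle: true)
    {
        this.byteCount = byteCount;
        this.SetHandle(Marshal.AllocHGlobal(byteCount));

        // Tell the GC that this object keeps 'byteCount' bytes of native
        // memory alive, so a collection becomes worthwhile if such handles
        // are leaked and memory pressure builds up.
        GC.AddMemoryPressure(byteCount);
    }

    public override bool IsInvalid => this.handle == IntPtr.Zero;

    protected override bool ReleaseHandle()
    {
        // Runs either from Dispose (deterministic) or from the SafeHandle
        // finalizer (the leak case discussed in the review thread).
        Marshal.FreeHGlobal(this.handle);
        GC.RemoveMemoryPressure(this.byteCount);
        this.handle = IntPtr.Zero;
        return true;
    }
}

internal static class LeakDemo
{
    private static void Main()
    {
        // Deterministic path: Dispose -> ReleaseHandle -> pressure removed.
        using (var block = new NativeBlockHandle(32 * 1024 * 1024))
        {
        }

        // "Leaked" path: no Dispose call. The registered pressure gives the
        // GC a reason to collect, and the finalizer then frees the memory.
        _ = new NativeBlockHandle(32 * 1024 * 1024);
        GC.Collect();
        GC.WaitForPendingFinalizers();
    }
}

Whether registering the pressure is worth the cost for buffers that are almost always disposed deterministically is exactly the trade-off weighed in the review comments above.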