[JSC] BlockDirectory's bits should be compact
author ysuzuki@apple.com <ysuzuki@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Thu, 14 Nov 2019 09:37:12 +0000 (09:37 +0000)
committer ysuzuki@apple.com <ysuzuki@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Thu, 14 Nov 2019 09:37:12 +0000 (09:37 +0000)
https://bugs.webkit.org/show_bug.cgi?id=204149

Reviewed by Robin Morisset.

Source/JavaScriptCore:

We are starting to apply IsoSubspace to all JSCells. This means that IsoSubspace should be small
enough that we can hold many IsoSubspaces without worrying about memory regressions.

In this patch, we introduce several changes to shrink sizeof(IsoSubspace) from 528 to 384 bytes.

1. Adjust members to remove some padding.
2. Remove the m_heap field, since it can easily be obtained from the caller.
3. Make MarkedSpace::heap() efficient: just pointer arithmetic.
4. Remove the m_size field from IsoSubspace, since BlockDirectory already knows the cellSize.
5. Introduce BlockDirectoryBits, which replaces the 9 FastBitVectors in BlockDirectory with one class.
   Since all the FastBitVectors have the same size, we should not store a size field for each one.
   We reuse FastBitVector's View mechanism to keep the same ergonomics while making BlockDirectoryBits
   much smaller. We group 9 uint32_t words into a Segment, and manage a Vector<Segment> in this data
   structure. Since we touch several bits at the same time for the same block index, this data
   structure is compact and efficient.

* CMakeLists.txt:
* JavaScriptCore.xcodeproj/project.pbxproj:
* heap/AlignedMemoryAllocator.cpp:
(JSC::AlignedMemoryAllocator::registerDirectory):
* heap/AlignedMemoryAllocator.h:
* heap/Allocator.h:
* heap/AllocatorInlines.h:
(JSC::Allocator::allocate const):
* heap/BlockDirectory.cpp:
(JSC::BlockDirectory::BlockDirectory):
(JSC::BlockDirectory::findEmptyBlockToSteal):
(JSC::BlockDirectory::findBlockForAllocation):
(JSC::BlockDirectory::tryAllocateBlock):
(JSC::BlockDirectory::addBlock):
(JSC::BlockDirectory::removeBlock):
(JSC::BlockDirectory::prepareForAllocation):
(JSC::BlockDirectory::beginMarkingForFullCollection):
(JSC::BlockDirectory::endMarking):
(JSC::BlockDirectory::snapshotUnsweptForEdenCollection):
(JSC::BlockDirectory::snapshotUnsweptForFullCollection):
(JSC::BlockDirectory::findBlockToSweep):
(JSC::BlockDirectory::sweep):
(JSC::BlockDirectory::shrink):
(JSC::BlockDirectory::assertNoUnswept):
(JSC::BlockDirectory::parallelNotEmptyBlockSource):
(JSC::BlockDirectory::dumpBits):
* heap/BlockDirectory.h:
(JSC::BlockDirectory::cellKind const):
(JSC::BlockDirectory::forEachBitVector):
(JSC::BlockDirectory::forEachBitVectorWithName):
(JSC::BlockDirectory::heap): Deleted.
* heap/BlockDirectoryBits.h: Added.
(JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::BlockDirectoryBitVectorWordView):
(JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::numBits const):
(JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::word const):
(JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::word):
(JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::clearAll):
(JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::view const):
(JSC::BlockDirectoryBits::numBits const):
(JSC::BlockDirectoryBits::resize):
(JSC::BlockDirectoryBits::forEachSegment):
* heap/BlockDirectoryInlines.h:
(JSC::BlockDirectory::forEachBlock):
(JSC::BlockDirectory::forEachNotEmptyBlock):
* heap/CompleteSubspace.cpp:
(JSC::CompleteSubspace::allocatorForSlow):
(JSC::CompleteSubspace::tryAllocateSlow):
* heap/CompleteSubspaceInlines.h:
(JSC::CompleteSubspace::allocateNonVirtual):
* heap/IsoCellSet.cpp:
(JSC::IsoCellSet::parallelNotEmptyMarkedBlockSource):
* heap/IsoCellSetInlines.h:
(JSC::IsoCellSet::forEachMarkedCell):
* heap/IsoSubspace.cpp:
(JSC::IsoSubspace::IsoSubspace):
(JSC::IsoSubspace::tryAllocateFromLowerTier):
* heap/IsoSubspace.h:
(JSC::IsoSubspace::cellSize):
(JSC::IsoSubspace::allocatorForNonVirtual):
(JSC::IsoSubspace::size const): Deleted.
(): Deleted.
* heap/IsoSubspaceInlines.h:
(JSC::IsoSubspace::allocateNonVirtual):
* heap/IsoSubspacePerVM.cpp:
(JSC::IsoSubspacePerVM::AutoremovingIsoSubspace::~AutoremovingIsoSubspace):
* heap/LocalAllocator.cpp:
(JSC::LocalAllocator::allocateSlowCase):
(JSC::LocalAllocator::doTestCollectionsIfNeeded):
* heap/LocalAllocator.h:
* heap/LocalAllocatorInlines.h:
(JSC::LocalAllocator::allocate):
* heap/MarkedBlock.cpp:
(JSC::MarkedBlock::Handle::dumpState):
* heap/MarkedSpace.cpp:
(JSC::MarkedSpace::MarkedSpace):
(JSC::MarkedSpace::sweepBlocks):
(JSC::MarkedSpace::prepareForAllocation):
(JSC::MarkedSpace::visitWeakSets):
(JSC::MarkedSpace::reapWeakSets):
(JSC::MarkedSpace::prepareForMarking):
(JSC::MarkedSpace::beginMarking):
(JSC::MarkedSpace::snapshotUnswept):
* heap/MarkedSpace.h:
(JSC::MarkedSpace::heap const): Deleted.
* heap/MarkedSpaceInlines.h:
(JSC::MarkedSpace::heap const):
* heap/Subspace.cpp:
(JSC::Subspace::initialize):
* heap/Subspace.h:

Source/WTF:

* wtf/FastBitVector.h:
(WTF::fastBitVectorArrayLength):
(WTF::FastBitVectorImpl::unsafeWords):
(WTF::FastBitVectorImpl::unsafeWords const):
(WTF::FastBitReference::FastBitReference):
(WTF::FastBitReference::operator bool const):
(WTF::FastBitReference::operator=):
(WTF::FastBitVector::at):
(WTF::FastBitVector::operator[]):
(WTF::FastBitVector::BitReference::BitReference): Deleted.
(WTF::FastBitVector::BitReference::operator bool const): Deleted.
(WTF::FastBitVector::BitReference::operator=): Deleted.

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@252452 268f45cc-cd09-0410-ab3c-d52691b4dbfc

30 files changed:
Source/JavaScriptCore/CMakeLists.txt
Source/JavaScriptCore/ChangeLog
Source/JavaScriptCore/JavaScriptCore.xcodeproj/project.pbxproj
Source/JavaScriptCore/heap/AlignedMemoryAllocator.cpp
Source/JavaScriptCore/heap/AlignedMemoryAllocator.h
Source/JavaScriptCore/heap/Allocator.h
Source/JavaScriptCore/heap/AllocatorInlines.h
Source/JavaScriptCore/heap/BlockDirectory.cpp
Source/JavaScriptCore/heap/BlockDirectory.h
Source/JavaScriptCore/heap/BlockDirectoryBits.h [new file with mode: 0644]
Source/JavaScriptCore/heap/BlockDirectoryInlines.h
Source/JavaScriptCore/heap/CompleteSubspace.cpp
Source/JavaScriptCore/heap/CompleteSubspaceInlines.h
Source/JavaScriptCore/heap/IsoCellSet.cpp
Source/JavaScriptCore/heap/IsoCellSetInlines.h
Source/JavaScriptCore/heap/IsoSubspace.cpp
Source/JavaScriptCore/heap/IsoSubspace.h
Source/JavaScriptCore/heap/IsoSubspaceInlines.h
Source/JavaScriptCore/heap/IsoSubspacePerVM.cpp
Source/JavaScriptCore/heap/LocalAllocator.cpp
Source/JavaScriptCore/heap/LocalAllocator.h
Source/JavaScriptCore/heap/LocalAllocatorInlines.h
Source/JavaScriptCore/heap/MarkedBlock.cpp
Source/JavaScriptCore/heap/MarkedSpace.cpp
Source/JavaScriptCore/heap/MarkedSpace.h
Source/JavaScriptCore/heap/MarkedSpaceInlines.h
Source/JavaScriptCore/heap/Subspace.cpp
Source/JavaScriptCore/heap/Subspace.h
Source/WTF/ChangeLog
Source/WTF/wtf/FastBitVector.h

index dceb3fc..b6f0aed 100644 (file)
@@ -569,6 +569,7 @@ set(JavaScriptCore_PRIVATE_FRAMEWORK_HEADERS
     heap/AllocatorInlines.h
     heap/AllocatorForMode.h
     heap/BlockDirectory.h
+    heap/BlockDirectoryBits.h
     heap/BlockDirectoryInlines.h
     heap/CellAttributes.h
     heap/CellContainer.h
index 6ec7b5c..f676ddc 100644 (file)
@@ -1,3 +1,116 @@
+2019-11-14  Yusuke Suzuki  <ysuzuki@apple.com>
+
+        [JSC] BlockDirectory's bits should be compact
+        https://bugs.webkit.org/show_bug.cgi?id=204149
+
+        Reviewed by Robin Morisset.
+
+        We are starting to apply IsoSubspace to all JSCells. This means that IsoSubspace should be small
+        enough that we can hold many IsoSubspaces without worrying about memory regressions.
+
+        In this patch, we introduce several changes to shrink sizeof(IsoSubspace) from 528 to 384 bytes.
+
+        1. Adjust members to remove some padding.
+        2. Remove the m_heap field, since it can easily be obtained from the caller.
+        3. Make MarkedSpace::heap() efficient: just pointer arithmetic.
+        4. Remove the m_size field from IsoSubspace, since BlockDirectory already knows the cellSize.
+        5. Introduce BlockDirectoryBits, which replaces the 9 FastBitVectors in BlockDirectory with one class.
+           Since all the FastBitVectors have the same size, we should not store a size field for each one.
+           We reuse FastBitVector's View mechanism to keep the same ergonomics while making BlockDirectoryBits
+           much smaller. We group 9 uint32_t words into a Segment, and manage a Vector<Segment> in this data
+           structure. Since we touch several bits at the same time for the same block index, this data
+           structure is compact and efficient.
+
+        * CMakeLists.txt:
+        * JavaScriptCore.xcodeproj/project.pbxproj:
+        * heap/AlignedMemoryAllocator.cpp:
+        (JSC::AlignedMemoryAllocator::registerDirectory):
+        * heap/AlignedMemoryAllocator.h:
+        * heap/Allocator.h:
+        * heap/AllocatorInlines.h:
+        (JSC::Allocator::allocate const):
+        * heap/BlockDirectory.cpp:
+        (JSC::BlockDirectory::BlockDirectory):
+        (JSC::BlockDirectory::findEmptyBlockToSteal):
+        (JSC::BlockDirectory::findBlockForAllocation):
+        (JSC::BlockDirectory::tryAllocateBlock):
+        (JSC::BlockDirectory::addBlock):
+        (JSC::BlockDirectory::removeBlock):
+        (JSC::BlockDirectory::prepareForAllocation):
+        (JSC::BlockDirectory::beginMarkingForFullCollection):
+        (JSC::BlockDirectory::endMarking):
+        (JSC::BlockDirectory::snapshotUnsweptForEdenCollection):
+        (JSC::BlockDirectory::snapshotUnsweptForFullCollection):
+        (JSC::BlockDirectory::findBlockToSweep):
+        (JSC::BlockDirectory::sweep):
+        (JSC::BlockDirectory::shrink):
+        (JSC::BlockDirectory::assertNoUnswept):
+        (JSC::BlockDirectory::parallelNotEmptyBlockSource):
+        (JSC::BlockDirectory::dumpBits):
+        * heap/BlockDirectory.h:
+        (JSC::BlockDirectory::cellKind const):
+        (JSC::BlockDirectory::forEachBitVector):
+        (JSC::BlockDirectory::forEachBitVectorWithName):
+        (JSC::BlockDirectory::heap): Deleted.
+        * heap/BlockDirectoryBits.h: Added.
+        (JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::BlockDirectoryBitVectorWordView):
+        (JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::numBits const):
+        (JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::word const):
+        (JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::word):
+        (JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::clearAll):
+        (JSC::BlockDirectoryBits::BlockDirectoryBitVectorWordView::view const):
+        (JSC::BlockDirectoryBits::numBits const):
+        (JSC::BlockDirectoryBits::resize):
+        (JSC::BlockDirectoryBits::forEachSegment):
+        * heap/BlockDirectoryInlines.h:
+        (JSC::BlockDirectory::forEachBlock):
+        (JSC::BlockDirectory::forEachNotEmptyBlock):
+        * heap/CompleteSubspace.cpp:
+        (JSC::CompleteSubspace::allocatorForSlow):
+        (JSC::CompleteSubspace::tryAllocateSlow):
+        * heap/CompleteSubspaceInlines.h:
+        (JSC::CompleteSubspace::allocateNonVirtual):
+        * heap/IsoCellSet.cpp:
+        (JSC::IsoCellSet::parallelNotEmptyMarkedBlockSource):
+        * heap/IsoCellSetInlines.h:
+        (JSC::IsoCellSet::forEachMarkedCell):
+        * heap/IsoSubspace.cpp:
+        (JSC::IsoSubspace::IsoSubspace):
+        (JSC::IsoSubspace::tryAllocateFromLowerTier):
+        * heap/IsoSubspace.h:
+        (JSC::IsoSubspace::cellSize):
+        (JSC::IsoSubspace::allocatorForNonVirtual):
+        (JSC::IsoSubspace::size const): Deleted.
+        (): Deleted.
+        * heap/IsoSubspaceInlines.h:
+        (JSC::IsoSubspace::allocateNonVirtual):
+        * heap/IsoSubspacePerVM.cpp:
+        (JSC::IsoSubspacePerVM::AutoremovingIsoSubspace::~AutoremovingIsoSubspace):
+        * heap/LocalAllocator.cpp:
+        (JSC::LocalAllocator::allocateSlowCase):
+        (JSC::LocalAllocator::doTestCollectionsIfNeeded):
+        * heap/LocalAllocator.h:
+        * heap/LocalAllocatorInlines.h:
+        (JSC::LocalAllocator::allocate):
+        * heap/MarkedBlock.cpp:
+        (JSC::MarkedBlock::Handle::dumpState):
+        * heap/MarkedSpace.cpp:
+        (JSC::MarkedSpace::MarkedSpace):
+        (JSC::MarkedSpace::sweepBlocks):
+        (JSC::MarkedSpace::prepareForAllocation):
+        (JSC::MarkedSpace::visitWeakSets):
+        (JSC::MarkedSpace::reapWeakSets):
+        (JSC::MarkedSpace::prepareForMarking):
+        (JSC::MarkedSpace::beginMarking):
+        (JSC::MarkedSpace::snapshotUnswept):
+        * heap/MarkedSpace.h:
+        (JSC::MarkedSpace::heap const): Deleted.
+        * heap/MarkedSpaceInlines.h:
+        (JSC::MarkedSpace::heap const):
+        * heap/Subspace.cpp:
+        (JSC::Subspace::initialize):
+        * heap/Subspace.h:
+
 2019-11-13  Robin Morisset  <rmorisset@apple.com>
 
         Split ArithProfile into a Unary and a Binary version
index 4d27d98..363bef4 100644 (file)
                E36CC9472086314F0051FFD6 /* WasmCreationMode.h in Headers */ = {isa = PBXBuildFile; fileRef = E36CC9462086314F0051FFD6 /* WasmCreationMode.h */; settings = {ATTRIBUTES = (Private, ); }; };
                E3794E761B77EB97005543AE /* ModuleAnalyzer.h in Headers */ = {isa = PBXBuildFile; fileRef = E3794E741B77EB97005543AE /* ModuleAnalyzer.h */; settings = {ATTRIBUTES = (Private, ); }; };
                E3850B15226ED641009ABF9C /* DFGMinifiedIDInlines.h in Headers */ = {isa = PBXBuildFile; fileRef = E3850B14226ED63E009ABF9C /* DFGMinifiedIDInlines.h */; };
+               E38652E3237CA0C900E1D5EE /* BlockDirectoryBits.h in Headers */ = {isa = PBXBuildFile; fileRef = E38652E2237CA0C800E1D5EE /* BlockDirectoryBits.h */; settings = {ATTRIBUTES = (Private, ); }; };
                E3893A1D2203A7C600E79A74 /* AsyncFromSyncIteratorPrototype.lut.h in Headers */ = {isa = PBXBuildFile; fileRef = E3893A1C2203A7C600E79A74 /* AsyncFromSyncIteratorPrototype.lut.h */; };
                E38D999C221B78BB00D50474 /* JSNonDestructibleProxy.h in Headers */ = {isa = PBXBuildFile; fileRef = E38D999A221B789F00D50474 /* JSNonDestructibleProxy.h */; settings = {ATTRIBUTES = (Private, ); }; };
                E39006212208BFC4001019CF /* SubspaceAccess.h in Headers */ = {isa = PBXBuildFile; fileRef = E39006202208BFC3001019CF /* SubspaceAccess.h */; settings = {ATTRIBUTES = (Private, ); }; };
                E380A76B1DCD7195000F89E6 /* MacroAssemblerHelpers.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = MacroAssemblerHelpers.h; sourceTree = "<group>"; };
                E380D66B1F19249D00A59095 /* BuiltinNames.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = BuiltinNames.cpp; sourceTree = "<group>"; };
                E3850B14226ED63E009ABF9C /* DFGMinifiedIDInlines.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; name = DFGMinifiedIDInlines.h; path = dfg/DFGMinifiedIDInlines.h; sourceTree = "<group>"; };
+               E38652E2237CA0C800E1D5EE /* BlockDirectoryBits.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = BlockDirectoryBits.h; sourceTree = "<group>"; };
                E3893A1C2203A7C600E79A74 /* AsyncFromSyncIteratorPrototype.lut.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AsyncFromSyncIteratorPrototype.lut.h; sourceTree = "<group>"; };
                E38D060B1F8E814100649CF2 /* JSScriptFetchParameters.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = JSScriptFetchParameters.h; sourceTree = "<group>"; };
                E38D060C1F8E814100649CF2 /* ScriptFetchParameters.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ScriptFetchParameters.h; sourceTree = "<group>"; };
                                0FB4677E1FDDA6E5003FCB09 /* AtomIndices.h */,
                                C2B916C414DA040C00CBAC86 /* BlockDirectory.cpp */,
                                C2B916C114DA014E00CBAC86 /* BlockDirectory.h */,
+                               E38652E2237CA0C800E1D5EE /* BlockDirectoryBits.h */,
                                0F7DF1451E2BEF680095951B /* BlockDirectoryInlines.h */,
                                0F9630351D4192C3005609D9 /* CellAttributes.cpp */,
                                0F9630361D4192C3005609D9 /* CellAttributes.h */,
                                86976E5E1FA3E8B600E7C4E1 /* BigIntPrototype.h in Headers */,
                                0F64B2721A784BAF006E4E66 /* BinarySwitch.h in Headers */,
                                C2B916C214DA014E00CBAC86 /* BlockDirectory.h in Headers */,
+                               E38652E3237CA0C900E1D5EE /* BlockDirectoryBits.h in Headers */,
                                0F7DF1461E2BEF6A0095951B /* BlockDirectoryInlines.h in Headers */,
                                BC18C3EC0E16F5CD00B34460 /* BooleanObject.h in Headers */,
                                9B4694391F97439E00CCB3F9 /* BooleanPrototype.h in Headers */,
index e7c93c2..ea32df3 100644 (file)
@@ -40,12 +40,12 @@ AlignedMemoryAllocator::~AlignedMemoryAllocator()
 {
 }
 
-void AlignedMemoryAllocator::registerDirectory(BlockDirectory* directory)
+void AlignedMemoryAllocator::registerDirectory(Heap& heap, BlockDirectory* directory)
 {
     RELEASE_ASSERT(!directory->nextDirectoryInAlignedMemoryAllocator());
     
     if (m_directories.isEmpty()) {
-        ASSERT(!Thread::mayBeGCThread() || directory->heap()->worldIsStopped());
+        ASSERT_UNUSED(heap, !Thread::mayBeGCThread() || heap.worldIsStopped());
         for (Subspace* subspace = m_subspaces.first(); subspace; subspace = subspace->nextSubspaceInAlignedMemoryAllocator())
             subspace->didCreateFirstDirectory(directory);
     }
index 66442ad..9566963 100644 (file)
@@ -31,6 +31,7 @@
 namespace JSC {
 
 class BlockDirectory;
+class Heap;
 class Subspace;
 
 class AlignedMemoryAllocator {
@@ -45,7 +46,7 @@ public:
     
     virtual void dump(PrintStream&) const = 0;
 
-    void registerDirectory(BlockDirectory*);
+    void registerDirectory(Heap&, BlockDirectory*);
     BlockDirectory* firstDirectory() const { return m_directories.first(); }
 
     void registerSubspace(Subspace*);
index 229158f..7239d22 100644 (file)
@@ -31,6 +31,7 @@
 namespace JSC {
 
 class GCDeferralContext;
+class Heap;
 class LocalAllocator;
 
 // This abstracts how we refer to LocalAllocator so that we could eventually support thread-local
@@ -45,7 +46,7 @@ public:
     {
     }
     
-    void* allocate(GCDeferralContext*, AllocationFailureMode) const;
+    void* allocate(Heap&, GCDeferralContext*, AllocationFailureMode) const;
     
     unsigned cellSize() const;
     
index 1757327..695b089 100644 (file)
@@ -30,9 +30,9 @@
 
 namespace JSC {
 
-ALWAYS_INLINE void* Allocator::allocate(GCDeferralContext* context, AllocationFailureMode mode) const
+ALWAYS_INLINE void* Allocator::allocate(Heap& heap, GCDeferralContext* context, AllocationFailureMode mode) const
 {
-    return m_localAllocator->allocate(context, mode);
+    return m_localAllocator->allocate(heap, context, mode);
 }
 
 } // namespace JSC
index 7eee4a8..0cc4b47 100644 (file)
@@ -38,9 +38,8 @@
 
 namespace JSC {
 
-BlockDirectory::BlockDirectory(Heap* heap, size_t cellSize)
+BlockDirectory::BlockDirectory(size_t cellSize)
     : m_cellSize(static_cast<unsigned>(cellSize))
-    , m_heap(heap)
 {
 }
 
@@ -76,7 +75,7 @@ bool BlockDirectory::isPagedOut(MonotonicTime deadline)
 
 MarkedBlock::Handle* BlockDirectory::findEmptyBlockToSteal()
 {
-    m_emptyCursor = m_empty.findBit(m_emptyCursor, true);
+    m_emptyCursor = m_bits.empty().findBit(m_emptyCursor, true);
     if (m_emptyCursor >= m_blocks.size())
         return nullptr;
     return m_blocks[m_emptyCursor];
@@ -85,7 +84,7 @@ MarkedBlock::Handle* BlockDirectory::findEmptyBlockToSteal()
 MarkedBlock::Handle* BlockDirectory::findBlockForAllocation(LocalAllocator& allocator)
 {
     for (;;) {
-        allocator.m_allocationCursor = (m_canAllocateButNotEmpty | m_empty).findBit(allocator.m_allocationCursor, true);
+        allocator.m_allocationCursor = (m_bits.canAllocateButNotEmpty() | m_bits.empty()).findBit(allocator.m_allocationCursor, true);
         if (allocator.m_allocationCursor >= m_blocks.size())
             return nullptr;
         
@@ -96,11 +95,11 @@ MarkedBlock::Handle* BlockDirectory::findBlockForAllocation(LocalAllocator& allo
     }
 }
 
-MarkedBlock::Handle* BlockDirectory::tryAllocateBlock()
+MarkedBlock::Handle* BlockDirectory::tryAllocateBlock(Heap& heap)
 {
     SuperSamplerScope superSamplerScope(false);
     
-    MarkedBlock::Handle* handle = MarkedBlock::tryCreate(*m_heap, subspace()->alignedMemoryAllocator());
+    MarkedBlock::Handle* handle = MarkedBlock::tryCreate(heap, subspace()->alignedMemoryAllocator());
     if (!handle)
         return nullptr;
     
@@ -118,21 +117,12 @@ void BlockDirectory::addBlock(MarkedBlock::Handle* block)
         size_t oldCapacity = m_blocks.capacity();
         m_blocks.append(block);
         if (m_blocks.capacity() != oldCapacity) {
-            forEachBitVector(
-                NoLockingNecessary,
-                [&] (FastBitVector& vector) {
-                    ASSERT_UNUSED(vector, vector.numBits() == oldCapacity);
-                });
-            
+            ASSERT(m_bits.numBits() == oldCapacity);
             ASSERT(m_blocks.capacity() > oldCapacity);
             
             LockHolder locker(m_bitvectorLock);
             subspace()->didResizeBits(m_blocks.capacity());
-            forEachBitVector(
-                locker,
-                [&] (FastBitVector& vector) {
-                    vector.resize(m_blocks.capacity());
-                });
+            m_bits.resize(m_blocks.capacity());
         }
     } else {
         index = m_freeBlockIndices.takeLast();
@@ -142,8 +132,8 @@ void BlockDirectory::addBlock(MarkedBlock::Handle* block)
     
     forEachBitVector(
         NoLockingNecessary,
-        [&] (FastBitVector& vector) {
-            ASSERT_UNUSED(vector, !vector[index]);
+        [&](auto vectorRef) {
+            ASSERT_UNUSED(vectorRef, !vectorRef[index]);
         });
 
     // This is the point at which the block learns of its cellSize() and attributes().
@@ -165,8 +155,8 @@ void BlockDirectory::removeBlock(MarkedBlock::Handle* block)
     
     forEachBitVector(
         holdLock(m_bitvectorLock),
-        [&] (FastBitVector& vector) {
-            vector[block->index()] = false;
+        [&](auto vectorRef) {
+            vectorRef[block->index()] = false;
         });
     
     block->didRemoveFromDirectory();
@@ -192,7 +182,7 @@ void BlockDirectory::prepareForAllocation()
     m_unsweptCursor = 0;
     m_emptyCursor = 0;
     
-    m_eden.clearAll();
+    m_bits.eden().clearAll();
 
     if (UNLIKELY(Options::useImmortalObjects())) {
         // FIXME: Make this work again.
@@ -237,20 +227,20 @@ void BlockDirectory::beginMarkingForFullCollection()
     // Mark bits are sticky and so is our summary of mark bits. We only clear these during full
     // collections, so if you survived the last collection you will survive the next one so long
     // as the next one is eden.
-    m_markingNotEmpty.clearAll();
-    m_markingRetired.clearAll();
+    m_bits.markingNotEmpty().clearAll();
+    m_bits.markingRetired().clearAll();
 }
 
 void BlockDirectory::endMarking()
 {
-    m_allocated.clearAll();
+    m_bits.allocated().clearAll();
     
     // It's surprising and frustrating to comprehend, but the end-of-marking flip does not need to
     // know what kind of collection it is. That knowledge is already encoded in the m_markingXYZ
     // vectors.
     
-    m_empty = m_live & ~m_markingNotEmpty;
-    m_canAllocateButNotEmpty = m_live & m_markingNotEmpty & ~m_markingRetired;
+    m_bits.empty() = m_bits.live() & ~m_bits.markingNotEmpty();
+    m_bits.canAllocateButNotEmpty() = m_bits.live() & m_bits.markingNotEmpty() & ~m_bits.markingRetired();
 
     if (needsDestruction()) {
         // There are some blocks that we didn't allocate out of in the last cycle, but we swept them. This
@@ -258,7 +248,7 @@ void BlockDirectory::endMarking()
         // destructors again. That's fine because of zapping. The only time when we cannot forget is when
         // we just allocate a block or when we move a block from one size class to another. That doesn't
         // happen here.
-        m_destructible = m_live;
+        m_bits.destructible() = m_bits.live();
     }
     
     if (false) {
@@ -269,17 +259,17 @@ void BlockDirectory::endMarking()
 
 void BlockDirectory::snapshotUnsweptForEdenCollection()
 {
-    m_unswept |= m_eden;
+    m_bits.unswept() |= m_bits.eden();
 }
 
 void BlockDirectory::snapshotUnsweptForFullCollection()
 {
-    m_unswept = m_live;
+    m_bits.unswept() = m_bits.live();
 }
 
 MarkedBlock::Handle* BlockDirectory::findBlockToSweep()
 {
-    m_unsweptCursor = m_unswept.findBit(m_unsweptCursor, true);
+    m_unsweptCursor = m_bits.unswept().findBit(m_unsweptCursor, true);
     if (m_unsweptCursor >= m_blocks.size())
         return nullptr;
     return m_blocks[m_unsweptCursor];
@@ -287,7 +277,7 @@ MarkedBlock::Handle* BlockDirectory::findBlockToSweep()
 
 void BlockDirectory::sweep()
 {
-    m_unswept.forEachSetBit(
+    m_bits.unswept().forEachSetBit(
         [&] (size_t index) {
             MarkedBlock::Handle* block = m_blocks[index];
             block->sweep(nullptr);
@@ -296,7 +286,7 @@ void BlockDirectory::sweep()
 
 void BlockDirectory::shrink()
 {
-    (m_empty & ~m_destructible).forEachSetBit(
+    (m_bits.empty() & ~m_bits.destructible()).forEachSetBit(
         [&] (size_t index) {
             markedSpace().freeBlock(m_blocks[index]);
         });
@@ -307,7 +297,7 @@ void BlockDirectory::assertNoUnswept()
     if (ASSERT_DISABLED)
         return;
     
-    if (m_unswept.isEmpty())
+    if (m_bits.unswept().isEmpty())
         return;
     
     dataLog("Assertion failed: unswept not empty in ", *this, ".\n");
@@ -329,7 +319,7 @@ RefPtr<SharedTask<MarkedBlock::Handle*()>> BlockDirectory::parallelNotEmptyBlock
             if (m_done)
                 return nullptr;
             auto locker = holdLock(m_lock);
-            m_index = m_directory.m_markingNotEmpty.findBit(m_index, true);
+            m_index = m_directory.m_bits.markingNotEmpty().findBit(m_index, true);
             if (m_index >= m_directory.m_blocks.size()) {
                 m_done = true;
                 return nullptr;
@@ -357,18 +347,19 @@ void BlockDirectory::dumpBits(PrintStream& out)
     unsigned maxNameLength = 0;
     forEachBitVectorWithName(
         NoLockingNecessary,
-        [&] (FastBitVector&, const char* name) {
+        [&](auto vectorRef, const char* name) {
+            UNUSED_PARAM(vectorRef);
             unsigned length = strlen(name);
             maxNameLength = std::max(maxNameLength, length);
         });
     
     forEachBitVectorWithName(
         NoLockingNecessary,
-        [&] (FastBitVector& vector, const char* name) {
+        [&](auto vectorRef, const char* name) {
             out.print("    ", name, ": ");
             for (unsigned i = maxNameLength - strlen(name); i--;)
                 out.print(" ");
-            out.print(vector, "\n");
+            out.print(vectorRef, "\n");
         });
 }
 
index a7bcaa6..cb72cb1 100644 (file)
@@ -26,6 +26,7 @@
 #pragma once
 
 #include "AllocationFailureMode.h"
+#include "BlockDirectoryBits.h"
 #include "CellAttributes.h"
 #include "FreeList.h"
 #include "LocalAllocator.h"
@@ -44,33 +45,6 @@ class IsoCellSet;
 class MarkedSpace;
 class LLIntOffsetsExtractor;
 
-#define FOR_EACH_BLOCK_DIRECTORY_BIT(macro) \
-    macro(live, Live) /* The set of block indices that have actual blocks. */\
-    macro(empty, Empty) /* The set of all blocks that have no live objects. */ \
-    macro(allocated, Allocated) /* The set of all blocks that are full of live objects. */\
-    macro(canAllocateButNotEmpty, CanAllocateButNotEmpty) /* The set of all blocks are neither empty nor retired (i.e. are more than minMarkedBlockUtilization full). */ \
-    macro(destructible, Destructible) /* The set of all blocks that may have destructors to run. */\
-    macro(eden, Eden) /* The set of all blocks that have new objects since the last GC. */\
-    macro(unswept, Unswept) /* The set of all blocks that could be swept by the incremental sweeper. */\
-    \
-    /* These are computed during marking. */\
-    macro(markingNotEmpty, MarkingNotEmpty) /* The set of all blocks that are not empty. */ \
-    macro(markingRetired, MarkingRetired) /* The set of all blocks that are retired. */
-
-// FIXME: We defined canAllocateButNotEmpty and empty to be exclusive:
-//
-//     canAllocateButNotEmpty & empty == 0
-//
-// Instead of calling it canAllocate and making it inclusive:
-//
-//     canAllocate & empty == empty
-//
-// The latter is probably better. I'll leave it to a future bug to fix that, since breathing on
-// this code leads to regressions for days, and it's not clear that making this change would
-// improve perf since it would not change the collector's behavior, and either way the directory
-// has to look at both bitvectors.
-// https://bugs.webkit.org/show_bug.cgi?id=162121
-
 class BlockDirectory {
     WTF_MAKE_NONCOPYABLE(BlockDirectory);
     WTF_MAKE_FAST_ALLOCATED;
@@ -78,7 +52,7 @@ class BlockDirectory {
     friend class LLIntOffsetsExtractor;
 
 public:
-    BlockDirectory(Heap*, size_t cellSize);
+    BlockDirectory(size_t cellSize);
     ~BlockDirectory();
     void setSubspace(Subspace*);
     void lastChanceToFinalize();
@@ -98,7 +72,6 @@ public:
     bool needsDestruction() const { return m_attributes.destruction == NeedsDestruction; }
     DestructionMode destruction() const { return m_attributes.destruction; }
     HeapCell::Kind cellKind() const { return m_attributes.cellKind; }
-    Heap* heap() { return m_heap; }
 
     bool isFreeListedCell(const void* target);
 
@@ -115,9 +88,9 @@ public:
     Lock& bitvectorLock() { return m_bitvectorLock; }
 
 #define BLOCK_DIRECTORY_BIT_ACCESSORS(lowerBitName, capitalBitName)     \
-    bool is ## capitalBitName(const AbstractLocker&, size_t index) const { return m_ ## lowerBitName[index]; } \
+    bool is ## capitalBitName(const AbstractLocker&, size_t index) const { return m_bits.is ## capitalBitName(index); } \
     bool is ## capitalBitName(const AbstractLocker& locker, MarkedBlock::Handle* block) const { return is ## capitalBitName(locker, block->index()); } \
-    void setIs ## capitalBitName(const AbstractLocker&, size_t index, bool value) { m_ ## lowerBitName[index] = value; } \
+    void setIs ## capitalBitName(const AbstractLocker&, size_t index, bool value) { m_bits.setIs ## capitalBitName(index, value); } \
     void setIs ## capitalBitName(const AbstractLocker& locker, MarkedBlock::Handle* block, bool value) { setIs ## capitalBitName(locker, block->index(), value); }
     FOR_EACH_BLOCK_DIRECTORY_BIT(BLOCK_DIRECTORY_BIT_ACCESSORS)
 #undef BLOCK_DIRECTORY_BIT_ACCESSORS
@@ -126,7 +99,7 @@ public:
     void forEachBitVector(const AbstractLocker&, const Func& func)
     {
 #define BLOCK_DIRECTORY_BIT_CALLBACK(lowerBitName, capitalBitName) \
-        func(m_ ## lowerBitName);
+        func(m_bits.lowerBitName());
         FOR_EACH_BLOCK_DIRECTORY_BIT(BLOCK_DIRECTORY_BIT_CALLBACK);
 #undef BLOCK_DIRECTORY_BIT_CALLBACK
     }
@@ -135,7 +108,7 @@ public:
     void forEachBitVectorWithName(const AbstractLocker&, const Func& func)
     {
 #define BLOCK_DIRECTORY_BIT_CALLBACK(lowerBitName, capitalBitName) \
-        func(m_ ## lowerBitName, #capitalBitName);
+        func(m_bits.lowerBitName(), #capitalBitName);
         FOR_EACH_BLOCK_DIRECTORY_BIT(BLOCK_DIRECTORY_BIT_CALLBACK);
 #undef BLOCK_DIRECTORY_BIT_CALLBACK
     }
@@ -166,17 +139,14 @@ private:
     
     MarkedBlock::Handle* findBlockForAllocation(LocalAllocator&);
     
-    MarkedBlock::Handle* tryAllocateBlock();
+    MarkedBlock::Handle* tryAllocateBlock(Heap&);
     
     Vector<MarkedBlock::Handle*> m_blocks;
     Vector<unsigned> m_freeBlockIndices;
 
     // Mutator uses this to guard resizing the bitvectors. Those things in the GC that may run
     // concurrently to the mutator must lock this when accessing the bitvectors.
-#define BLOCK_DIRECTORY_BIT_DECLARATION(lowerBitName, capitalBitName) \
-    FastBitVector m_ ## lowerBitName;
-    FOR_EACH_BLOCK_DIRECTORY_BIT(BLOCK_DIRECTORY_BIT_DECLARATION)
-#undef BLOCK_DIRECTORY_BIT_DECLARATION
+    BlockDirectoryBits m_bits;
     Lock m_bitvectorLock;
     Lock m_localAllocatorsLock;
     CellAttributes m_attributes;
@@ -190,7 +160,6 @@ private:
     
     // FIXME: All of these should probably be references.
     // https://bugs.webkit.org/show_bug.cgi?id=166988
-    Heap* m_heap { nullptr };
     Subspace* m_subspace { nullptr };
     BlockDirectory* m_nextDirectory { nullptr };
     BlockDirectory* m_nextDirectoryInSubspace { nullptr };
diff --git a/Source/JavaScriptCore/heap/BlockDirectoryBits.h b/Source/JavaScriptCore/heap/BlockDirectoryBits.h
new file mode 100644 (file)
index 0000000..a39b143
--- /dev/null
@@ -0,0 +1,229 @@
+/*
+ * Copyright (C) 2019 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. ``AS IS'' AND ANY
+ * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL APPLE INC. OR
+ * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+ * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
+ * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+ * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
+ * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#pragma once
+
+#include <array>
+#include <wtf/FastBitVector.h>
+#include <wtf/Vector.h>
+
+namespace JSC {
+
+#define FOR_EACH_BLOCK_DIRECTORY_BIT(macro) \
+    macro(live, Live) /* The set of block indices that have actual blocks. */\
+    macro(empty, Empty) /* The set of all blocks that have no live objects. */ \
+    macro(allocated, Allocated) /* The set of all blocks that are full of live objects. */\
+    macro(canAllocateButNotEmpty, CanAllocateButNotEmpty) /* The set of all blocks that are neither empty nor retired (retired meaning more than minMarkedBlockUtilization full). */ \
+    macro(destructible, Destructible) /* The set of all blocks that may have destructors to run. */\
+    macro(eden, Eden) /* The set of all blocks that have new objects since the last GC. */\
+    macro(unswept, Unswept) /* The set of all blocks that could be swept by the incremental sweeper. */\
+    \
+    /* These are computed during marking. */\
+    macro(markingNotEmpty, MarkingNotEmpty) /* The set of all blocks that are not empty. */ \
+    macro(markingRetired, MarkingRetired) /* The set of all blocks that are retired. */
+
+// FIXME: We defined canAllocateButNotEmpty and empty to be exclusive:
+//
+//     canAllocateButNotEmpty & empty == 0
+//
+// Instead of calling it canAllocate and making it inclusive:
+//
+//     canAllocate & empty == empty
+//
+// The latter is probably better. I'll leave it to a future bug to fix that, since breathing on
+// this code leads to regressions for days, and it's not clear that making this change would
+// improve perf since it would not change the collector's behavior, and either way the directory
+// has to look at both bitvectors.
+// https://bugs.webkit.org/show_bug.cgi?id=162121
+
+class BlockDirectoryBits {
+    WTF_MAKE_FAST_ALLOCATED;
+public:
+    static constexpr unsigned bitsPerSegment = 32;
+    static constexpr unsigned segmentShift = 5;
+    static constexpr unsigned indexMask = (1U << segmentShift) - 1;
+    static_assert((1 << segmentShift) == bitsPerSegment);
+
+#define BLOCK_DIRECTORY_BIT_KIND_COUNT(lowerBitName, capitalBitName) + 1
+    static constexpr unsigned numberOfBlockDirectoryBitKinds = 0 FOR_EACH_BLOCK_DIRECTORY_BIT(BLOCK_DIRECTORY_BIT_KIND_COUNT);
+#undef BLOCK_DIRECTORY_BIT_KIND_COUNT
+
+    enum class Kind {
+#define BLOCK_DIRECTORY_BIT_KIND_DECLARATION(lowerBitName, capitalBitName) \
+        capitalBitName,
+        FOR_EACH_BLOCK_DIRECTORY_BIT(BLOCK_DIRECTORY_BIT_KIND_DECLARATION)
+#undef BLOCK_DIRECTORY_BIT_KIND_DECLARATION
+    };
+
+    class Segment {
+    public:
+        Segment() = default;
+        std::array<uint32_t, numberOfBlockDirectoryBitKinds> m_data { };
+    };
+
+    template<Kind kind>
+    class BlockDirectoryBitVectorWordView {
+        WTF_MAKE_FAST_ALLOCATED;
+    public:
+        using ViewType = BlockDirectoryBitVectorWordView;
+
+        BlockDirectoryBitVectorWordView() = default;
+
+        BlockDirectoryBitVectorWordView(const Segment* segments, size_t numBits)
+            : m_segments(segments)
+            , m_numBits(numBits)
+        {
+        }
+
+        size_t numBits() const
+        {
+            return m_numBits;
+        }
+
+        uint32_t word(size_t index) const
+        {
+            ASSERT(index < WTF::fastBitVectorArrayLength(numBits()));
+            return m_segments[index].m_data[static_cast<unsigned>(kind)];
+        }
+
+        uint32_t& word(size_t index)
+        {
+            ASSERT(index < WTF::fastBitVectorArrayLength(numBits()));
+            return const_cast<Segment*>(m_segments)[index].m_data[static_cast<unsigned>(kind)];
+        }
+
+        void clearAll()
+        {
+            for (size_t index = 0; index < WTF::fastBitVectorArrayLength(numBits()); ++index)
+                const_cast<Segment*>(m_segments)[index].m_data[static_cast<unsigned>(kind)] = 0;
+        }
+
+        BlockDirectoryBitVectorWordView view() const { return *this; }
+
+    private:
+        const Segment* m_segments { nullptr };
+        size_t m_numBits { 0 };
+    };
+
+    template<Kind kind>
+    using BlockDirectoryBitVectorView = WTF::FastBitVectorImpl<BlockDirectoryBitVectorWordView<kind>>;
+
+    template<Kind kind>
+    class BlockDirectoryBitVectorRef final : public BlockDirectoryBitVectorView<kind> {
+    public:
+        using Base = BlockDirectoryBitVectorView<kind>;
+
+        explicit BlockDirectoryBitVectorRef(BlockDirectoryBitVectorWordView<kind> view)
+            : Base(view)
+        {
+        }
+
+        template<typename OtherWords>
+        BlockDirectoryBitVectorRef& operator=(const WTF::FastBitVectorImpl<OtherWords>& other)
+        {
+            ASSERT(Base::numBits() == other.numBits());
+            for (unsigned i = Base::arrayLength(); i--;)
+                Base::unsafeWords().word(i) = other.unsafeWords().word(i);
+            return *this;
+        }
+
+        template<typename OtherWords>
+        BlockDirectoryBitVectorRef& operator|=(const WTF::FastBitVectorImpl<OtherWords>& other)
+        {
+            ASSERT(Base::numBits() == other.numBits());
+            for (unsigned i = Base::arrayLength(); i--;)
+                Base::unsafeWords().word(i) |= other.unsafeWords().word(i);
+            return *this;
+        }
+
+        void clearAll()
+        {
+            Base::unsafeWords().clearAll();
+        }
+
+        WTF::FastBitReference at(size_t index)
+        {
+            ASSERT(index < Base::numBits());
+            return WTF::FastBitReference(&Base::unsafeWords().word(index >> 5), 1 << (index & 31));
+        }
+
+        WTF::FastBitReference operator[](size_t index)
+        {
+            return at(index);
+        }
+    };
+
+#define BLOCK_DIRECTORY_BIT_ACCESSORS(lowerBitName, capitalBitName)     \
+    bool is ## capitalBitName(size_t index) const \
+    { \
+        return lowerBitName()[index]; \
+    } \
+    void setIs ## capitalBitName(size_t index, bool value) \
+    { \
+        lowerBitName()[index] = value; \
+    } \
+    BlockDirectoryBitVectorView<Kind::capitalBitName> lowerBitName() const \
+    { \
+        return BlockDirectoryBitVectorView<Kind::capitalBitName>(BlockDirectoryBitVectorWordView<Kind::capitalBitName>(m_segments.data(), m_numBits)); \
+    } \
+    BlockDirectoryBitVectorRef<Kind::capitalBitName> lowerBitName() \
+    { \
+        return BlockDirectoryBitVectorRef<Kind::capitalBitName>(BlockDirectoryBitVectorWordView<Kind::capitalBitName>(m_segments.data(), m_numBits)); \
+    }
+    FOR_EACH_BLOCK_DIRECTORY_BIT(BLOCK_DIRECTORY_BIT_ACCESSORS)
+#undef BLOCK_DIRECTORY_BIT_ACCESSORS
+
+    unsigned numBits() const { return m_numBits; }
+
+    void resize(unsigned numBits)
+    {
+        unsigned oldNumBits = m_numBits;
+        m_numBits = numBits;
+        m_segments.resize(WTF::fastBitVectorArrayLength(numBits));
+        unsigned usedBitsInLastSegment = numBits & indexMask; // This is 0 if all bits are used.
+        if (numBits < oldNumBits && usedBitsInLastSegment) {
+            // Clear the last segment.
+            ASSERT(usedBitsInLastSegment < bitsPerSegment);
+            auto& segment = m_segments.last();
+            uint32_t mask = (1U << usedBitsInLastSegment) - 1;
+            for (unsigned index = 0; index < numberOfBlockDirectoryBitKinds; ++index)
+                segment.m_data[index] &= mask;
+        }
+    }
+
+    template<typename Func>
+    void forEachSegment(const Func& func)
+    {
+        unsigned index = 0;
+        for (auto& segment : m_segments)
+            func(index++, segment);
+    }
+
+private:
+    Vector<Segment> m_segments;
+    unsigned m_numBits { 0 };
+};
+
+} // namespace JSC
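The key idea of BlockDirectoryBits above is that the nine per-directory bitvectors are interleaved word-by-word: one Segment holds the 32-bit word of every bit kind covering the same 32 block indices, so touching several bits for one block index stays within one small struct. A minimal self-contained sketch of that layout (illustrative names such as `InterleavedBits` and `kKinds`; not JSC code):

```cpp
#include <array>
#include <cassert>
#include <cstdint>
#include <vector>

// Sketch of the interleaved layout: K logical bitvectors stored so that
// the words covering the same 32 indices are adjacent in one Segment.
constexpr unsigned kKinds = 9;        // nine bit kinds, as in the patch
constexpr unsigned kBitsPerWord = 32;

struct Segment {
    std::array<uint32_t, kKinds> words {}; // words[kind] covers 32 indices
};

class InterleavedBits {
public:
    void resize(unsigned numBits)
    {
        m_segments.resize((numBits + kBitsPerWord - 1) / kBitsPerWord);
    }

    bool get(unsigned kind, unsigned index) const
    {
        return (m_segments[index / kBitsPerWord].words[kind] >> (index % kBitsPerWord)) & 1;
    }

    void set(unsigned kind, unsigned index, bool value)
    {
        uint32_t mask = 1u << (index % kBitsPerWord);
        uint32_t& word = m_segments[index / kBitsPerWord].words[kind];
        word = value ? (word | mask) : (word & ~mask);
    }

private:
    std::vector<Segment> m_segments; // one size field for all kKinds vectors
};
```

Because all nine vectors share the single `m_segments` length, the structure needs one size field total instead of one per FastBitVector, which is where the per-IsoSubspace size win comes from.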
index 41cbb49..684b1a8 100644 (file)
@@ -33,7 +33,7 @@ namespace JSC {
 
 template <typename Functor> inline void BlockDirectory::forEachBlock(const Functor& functor)
 {
-    m_live.forEachSetBit(
+    m_bits.live().forEachSetBit(
         [&] (size_t index) {
             functor(m_blocks[index]);
         });
@@ -41,7 +41,7 @@ template <typename Functor> inline void BlockDirectory::forEachBlock(const Funct
 
 template <typename Functor> inline void BlockDirectory::forEachNotEmptyBlock(const Functor& functor)
 {
-    m_markingNotEmpty.forEachSetBit(
+    m_bits.markingNotEmpty().forEachSetBit(
         [&] (size_t index) {
             functor(m_blocks[index]);
         });
index 2cb952f..e4dc623 100644 (file)
@@ -32,6 +32,7 @@
 #include "JSCInlines.h"
 #include "LocalAllocatorInlines.h"
 #include "MarkedBlockInlines.h"
+#include "MarkedSpaceInlines.h"
 #include "PreventCollectionScope.h"
 #include "SubspaceInlines.h"
 
@@ -79,7 +80,7 @@ Allocator CompleteSubspace::allocatorForSlow(size_t size)
     if (false)
         dataLog("Creating BlockDirectory/LocalAllocator for ", m_name, ", ", attributes(), ", ", sizeClass, ".\n");
     
-    std::unique_ptr<BlockDirectory> uniqueDirectory = makeUnique<BlockDirectory>(m_space.heap(), sizeClass);
+    std::unique_ptr<BlockDirectory> uniqueDirectory = makeUnique<BlockDirectory>(sizeClass);
     BlockDirectory* directory = uniqueDirectory.get();
     m_directories.append(WTFMove(uniqueDirectory));
     
@@ -105,7 +106,7 @@ Allocator CompleteSubspace::allocatorForSlow(size_t size)
     }
     
     directory->setNextDirectoryInSubspace(m_firstDirectory);
-    m_alignedMemoryAllocator->registerDirectory(directory);
+    m_alignedMemoryAllocator->registerDirectory(m_space.heap(), directory);
     WTF::storeStoreFence();
     m_firstDirectory = directory;
     return allocator;
@@ -127,7 +128,7 @@ void* CompleteSubspace::tryAllocateSlow(VM& vm, size_t size, GCDeferralContext*
     sanitizeStackForVM(vm);
     
     if (Allocator allocator = allocatorFor(size, AllocatorForMode::EnsureAllocator))
-        return allocator.allocate(deferralContext, AllocationFailureMode::ReturnNull);
+        return allocator.allocate(vm.heap, deferralContext, AllocationFailureMode::ReturnNull);
     
     if (size <= Options::preciseAllocationCutoff()
         && size <= MarkedSpace::largeCutoff) {
index eace2a6..871df2b 100644 (file)
@@ -36,7 +36,7 @@ ALWAYS_INLINE void* CompleteSubspace::allocateNonVirtual(VM& vm, size_t size, GC
         RELEASE_ASSERT(vm.heap.expectDoesGC());
 
     if (Allocator allocator = allocatorForNonVirtual(size, AllocatorForMode::AllocatorIfExists))
-        return allocator.allocate(deferralContext, failureMode);
+        return allocator.allocate(vm.heap, deferralContext, failureMode);
     return allocateSlow(vm, size, deferralContext, failureMode);
 }
 
index aa62e11..80a9ceb 100644 (file)
@@ -61,7 +61,7 @@ Ref<SharedTask<MarkedBlock::Handle*()>> IsoCellSet::parallelNotEmptyMarkedBlockS
             if (m_done)
                 return nullptr;
             auto locker = holdLock(m_lock);
-            auto bits = m_directory.m_markingNotEmpty & m_set.m_blocksWithBits;
+            auto bits = m_directory.m_bits.markingNotEmpty() & m_set.m_blocksWithBits;
             m_index = bits.findBit(m_index, true);
             if (m_index >= m_directory.m_blocks.size()) {
                 m_done = true;
index fc81692..db7e8ac 100644 (file)
@@ -70,7 +70,7 @@ template<typename Func>
 void IsoCellSet::forEachMarkedCell(const Func& func)
 {
     BlockDirectory& directory = m_subspace.m_directory;
-    (directory.m_markingNotEmpty & m_blocksWithBits).forEachSetBit(
+    (directory.m_bits.markingNotEmpty() & m_blocksWithBits).forEachSetBit(
         [&] (size_t blockIndex) {
             MarkedBlock::Handle* block = directory.m_blocks[blockIndex];
 
index 453e602..d1cf61f 100644 (file)
 #include "IsoCellSetInlines.h"
 #include "IsoSubspaceInlines.h"
 #include "LocalAllocatorInlines.h"
+#include "MarkedSpaceInlines.h"
 
 namespace JSC {
 
 IsoSubspace::IsoSubspace(CString name, Heap& heap, HeapCellType* heapCellType, size_t size, uint8_t numberOfLowerTierCells)
     : Subspace(name, heap)
-    , m_size(size)
-    , m_directory(&heap, WTF::roundUpToMultipleOf<MarkedBlock::atomSize>(size))
+    , m_directory(WTF::roundUpToMultipleOf<MarkedBlock::atomSize>(size))
     , m_localAllocator(&m_directory)
     , m_isoAlignedMemoryAllocator(makeUnique<IsoAlignedMemoryAllocator>())
-    , m_remainingLowerTierCellCount(numberOfLowerTierCells)
 {
+    m_remainingLowerTierCellCount = numberOfLowerTierCells;
+    ASSERT(WTF::roundUpToMultipleOf<MarkedBlock::atomSize>(size) == cellSize());
     ASSERT(numberOfLowerTierCells <= MarkedBlock::maxNumberOfLowerTierCells);
     m_isIsoSubspace = true;
     initialize(heapCellType, m_isoAlignedMemoryAllocator.get());
@@ -50,7 +51,7 @@ IsoSubspace::IsoSubspace(CString name, Heap& heap, HeapCellType* heapCellType, s
     auto locker = holdLock(m_space.directoryLock());
     m_directory.setSubspace(this);
     m_space.addBlockDirectory(locker, &m_directory);
-    m_alignedMemoryAllocator->registerDirectory(&m_directory);
+    m_alignedMemoryAllocator->registerDirectory(heap, &m_directory);
     m_firstDirectory = &m_directory;
 }
 
@@ -111,8 +112,7 @@ void* IsoSubspace::tryAllocateFromLowerTier()
         return revive(allocation);
     }
     if (m_remainingLowerTierCellCount) {
-        size_t size = WTF::roundUpToMultipleOf<MarkedSpace::sizeStep>(m_size);
-        PreciseAllocation* allocation = PreciseAllocation::createForLowerTier(*m_space.heap(), size, this, --m_remainingLowerTierCellCount);
+        PreciseAllocation* allocation = PreciseAllocation::createForLowerTier(m_space.heap(), cellSize(), this, --m_remainingLowerTierCellCount);
         return revive(allocation);
     }
     return nullptr;
index ca777d0..a5143bd 100644 (file)
@@ -40,7 +40,7 @@ public:
     JS_EXPORT_PRIVATE IsoSubspace(CString name, Heap&, HeapCellType*, size_t size, uint8_t numberOfLowerTierCells);
     JS_EXPORT_PRIVATE ~IsoSubspace();
 
-    size_t size() const { return m_size; }
+    size_t cellSize() { return m_directory.cellSize(); }
 
     Allocator allocatorFor(size_t, AllocatorForMode) override;
     Allocator allocatorForNonVirtual(size_t, AllocatorForMode);
@@ -63,18 +63,16 @@ private:
     void didRemoveBlock(size_t blockIndex) override;
     void didBeginSweepingToFreeList(MarkedBlock::Handle*) override;
     
-    size_t m_size;
     BlockDirectory m_directory;
     LocalAllocator m_localAllocator;
     std::unique_ptr<IsoAlignedMemoryAllocator> m_isoAlignedMemoryAllocator;
     SentinelLinkedList<PreciseAllocation, PackedRawSentinelNode<PreciseAllocation>> m_lowerTierFreeList;
     SentinelLinkedList<IsoCellSet, PackedRawSentinelNode<IsoCellSet>> m_cellSets;
-    uint8_t m_remainingLowerTierCellCount { 0 };
 };
 
 ALWAYS_INLINE Allocator IsoSubspace::allocatorForNonVirtual(size_t size, AllocatorForMode)
 {
-    RELEASE_ASSERT(size == this->size());
+    RELEASE_ASSERT(WTF::roundUpToMultipleOf<MarkedBlock::atomSize>(size) == cellSize());
     return Allocator(&m_localAllocator);
 }
 
index 0518a7f..4660e77 100644 (file)
 
 namespace JSC {
 
-ALWAYS_INLINE void* IsoSubspace::allocateNonVirtual(VM&, size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
+ALWAYS_INLINE void* IsoSubspace::allocateNonVirtual(VM& vm, size_t size, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
 {
-    RELEASE_ASSERT(size == this->size());
+    RELEASE_ASSERT(WTF::roundUpToMultipleOf<MarkedBlock::atomSize>(size) == cellSize());
     Allocator allocator = allocatorForNonVirtual(size, AllocatorForMode::MustAlreadyHaveAllocator);
-    void* result = allocator.allocate(deferralContext, failureMode);
+    void* result = allocator.allocate(vm.heap, deferralContext, failureMode);
     return result;
 }
 
index b9e1850..b2f6bbd 100644 (file)
@@ -41,7 +41,7 @@ public:
     ~AutoremovingIsoSubspace()
     {
         auto locker = holdLock(m_perVM.m_lock);
-        m_perVM.m_subspacePerVM.remove(&space().heap()->vm());
+        m_perVM.m_subspacePerVM.remove(&space().heap().vm());
     }
 
 private:
index 53e0508..f55571c 100644 (file)
@@ -110,12 +110,11 @@ void LocalAllocator::stopAllocatingForGood()
     reset();
 }
 
-void* LocalAllocator::allocateSlowCase(GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
+void* LocalAllocator::allocateSlowCase(Heap& heap, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
 {
     SuperSamplerScope superSamplerScope(false);
-    Heap& heap = *m_directory->m_heap;
     ASSERT(heap.vm().currentThreadIsHoldingAPILock());
-    doTestCollectionsIfNeeded(deferralContext);
+    doTestCollectionsIfNeeded(heap, deferralContext);
 
     ASSERT(!m_directory->markedSpace().isIterating());
     heap.didAllocate(m_freeList.originalSize());
@@ -129,7 +128,7 @@ void* LocalAllocator::allocateSlowCase(GCDeferralContext* deferralContext, Alloc
     // Goofy corner case: the GC called a callback and now this directory has a currentBlock. This only
     // happens when running WebKit tests, which inject a callback into the GC's finalization.
     if (UNLIKELY(m_currentBlock))
-        return allocate(deferralContext, failureMode);
+        return allocate(heap, deferralContext, failureMode);
     
     void* result = tryAllocateWithoutCollecting();
     
@@ -142,7 +141,7 @@ void* LocalAllocator::allocateSlowCase(GCDeferralContext* deferralContext, Alloc
             return result;
     }
     
-    MarkedBlock::Handle* block = m_directory->tryAllocateBlock();
+    MarkedBlock::Handle* block = m_directory->tryAllocateBlock(heap);
     if (!block) {
         if (failureMode == AllocationFailureMode::Assert)
             RELEASE_ASSERT_NOT_REACHED();
@@ -249,18 +248,18 @@ void* LocalAllocator::tryAllocateIn(MarkedBlock::Handle* block)
     return result;
 }
 
-void LocalAllocator::doTestCollectionsIfNeeded(GCDeferralContext* deferralContext)
+void LocalAllocator::doTestCollectionsIfNeeded(Heap& heap, GCDeferralContext* deferralContext)
 {
     if (!Options::slowPathAllocsBetweenGCs())
         return;
 
     static unsigned allocationCount = 0;
     if (!allocationCount) {
-        if (!m_directory->m_heap->isDeferred()) {
+        if (!heap.isDeferred()) {
             if (deferralContext)
                 deferralContext->m_shouldGC = true;
             else
-                m_directory->m_heap->collectNow(Sync, CollectionScope::Full);
+                heap.collectNow(Sync, CollectionScope::Full);
         }
     }
     if (++allocationCount >= Options::slowPathAllocsBetweenGCs())
index 437273f..9b651b0 100644 (file)
@@ -34,6 +34,7 @@ namespace JSC {
 
 class BlockDirectory;
 class GCDeferralContext;
+class Heap;
 
 class LocalAllocator : public BasicRawSentinelNode<LocalAllocator> {
     WTF_MAKE_NONCOPYABLE(LocalAllocator);
@@ -42,7 +43,7 @@ public:
     LocalAllocator(BlockDirectory*);
     ~LocalAllocator();
     
-    void* allocate(GCDeferralContext*, AllocationFailureMode);
+    void* allocate(Heap&, GCDeferralContext*, AllocationFailureMode);
     
     unsigned cellSize() const { return m_freeList.cellSize(); }
 
@@ -60,12 +61,12 @@ private:
     friend class BlockDirectory;
     
     void reset();
-    JS_EXPORT_PRIVATE void* allocateSlowCase(GCDeferralContext*, AllocationFailureMode failureMode);
+    JS_EXPORT_PRIVATE void* allocateSlowCase(Heap&, GCDeferralContext*, AllocationFailureMode);
     void didConsumeFreeList();
     void* tryAllocateWithoutCollecting();
     void* tryAllocateIn(MarkedBlock::Handle*);
     void* allocateIn(MarkedBlock::Handle*);
-    ALWAYS_INLINE void doTestCollectionsIfNeeded(GCDeferralContext*);
+    ALWAYS_INLINE void doTestCollectionsIfNeeded(Heap&, GCDeferralContext*);
 
     BlockDirectory* m_directory;
     FreeList m_freeList;
index 00d5574..c3c8695 100644 (file)
 
 namespace JSC {
 
-ALWAYS_INLINE void* LocalAllocator::allocate(GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
+ALWAYS_INLINE void* LocalAllocator::allocate(Heap& heap, GCDeferralContext* deferralContext, AllocationFailureMode failureMode)
 {
     if (validateDFGDoesGC)
-        RELEASE_ASSERT(m_directory->heap()->expectDoesGC());
+        RELEASE_ASSERT(heap.expectDoesGC());
     return m_freeList.allocate(
         [&] () -> HeapCell* {
-            sanitizeStackForVM(m_directory->heap()->vm());
-            return static_cast<HeapCell*>(allocateSlowCase(deferralContext, failureMode));
+            sanitizeStackForVM(heap.vm());
+            return static_cast<HeapCell*>(allocateSlowCase(heap, deferralContext, failureMode));
         });
 }
 
index c6e1f6c..42504a0 100644 (file)
@@ -377,8 +377,8 @@ void MarkedBlock::Handle::dumpState(PrintStream& out)
     CommaPrinter comma;
     directory()->forEachBitVectorWithName(
         holdLock(directory()->bitvectorLock()),
-        [&] (FastBitVector& bitvector, const char* name) {
-            out.print(comma, name, ":", bitvector[index()] ? "YES" : "no");
+        [&](auto vectorRef, const char* name) {
+            out.print(comma, name, ":", vectorRef[index()] ? "YES" : "no");
         });
 }
 
index 9703e35..d408d6e 100644 (file)
@@ -27,6 +27,7 @@
 #include "JSObject.h"
 #include "JSCInlines.h"
 #include "MarkedBlockInlines.h"
+#include "MarkedSpaceInlines.h"
 #include <wtf/ListDump.h>
 
 namespace JSC {
@@ -195,8 +196,8 @@ void MarkedSpace::initializeSizeClassForStepSize()
 }
 
 MarkedSpace::MarkedSpace(Heap* heap)
-    : m_heap(heap)
 {
+    ASSERT_UNUSED(heap, heap == &this->heap());
     initializeSizeClassForStepSize();
 }
 
@@ -234,7 +235,7 @@ void MarkedSpace::lastChanceToFinalize()
 
 void MarkedSpace::sweepBlocks()
 {
-    m_heap->sweeper().stopSweeping();
+    heap().sweeper().stopSweeping();
     forEachDirectory(
         [&] (BlockDirectory& directory) -> IterationStatus {
             directory.sweep();
@@ -270,13 +271,13 @@ void MarkedSpace::sweepPreciseAllocations()
 
 void MarkedSpace::prepareForAllocation()
 {
-    ASSERT(!Thread::mayBeGCThread() || m_heap->worldIsStopped());
+    ASSERT(!Thread::mayBeGCThread() || heap().worldIsStopped());
     for (Subspace* subspace : m_subspaces)
         subspace->prepareForAllocation();
 
     m_activeWeakSets.takeFrom(m_newActiveWeakSets);
     
-    if (m_heap->collectionScope() == CollectionScope::Eden)
+    if (heap().collectionScope() == CollectionScope::Eden)
         m_preciseAllocationsNurseryOffsetForSweep = m_preciseAllocationsNurseryOffset;
     else
         m_preciseAllocationsNurseryOffsetForSweep = 0;
@@ -298,7 +299,7 @@ void MarkedSpace::visitWeakSets(SlotVisitor& visitor)
     
     m_newActiveWeakSets.forEach(visit);
     
-    if (m_heap->collectionScope() == CollectionScope::Full)
+    if (heap().collectionScope() == CollectionScope::Full)
         m_activeWeakSets.forEach(visit);
 }
 
@@ -310,7 +311,7 @@ void MarkedSpace::reapWeakSets()
     
     m_newActiveWeakSets.forEach(visit);
     
-    if (m_heap->collectionScope() == CollectionScope::Full)
+    if (heap().collectionScope() == CollectionScope::Full)
         m_activeWeakSets.forEach(visit);
 }
 
@@ -356,7 +357,7 @@ void MarkedSpace::prepareForConservativeScan()
 
 void MarkedSpace::prepareForMarking()
 {
-    if (m_heap->collectionScope() == CollectionScope::Eden)
+    if (heap().collectionScope() == CollectionScope::Eden)
         m_preciseAllocationsOffsetForThisCollection = m_preciseAllocationsNurseryOffset;
     else
         m_preciseAllocationsOffsetForThisCollection = 0;
@@ -416,7 +417,7 @@ void MarkedSpace::shrink()
 
 void MarkedSpace::beginMarking()
 {
-    if (m_heap->collectionScope() == CollectionScope::Full) {
+    if (heap().collectionScope() == CollectionScope::Full) {
         forEachDirectory(
             [&] (BlockDirectory& directory) -> IterationStatus {
                 directory.beginMarkingForFullCollection();
@@ -551,7 +552,7 @@ void MarkedSpace::didAllocateInBlock(MarkedBlock::Handle* block)
 
 void MarkedSpace::snapshotUnswept()
 {
-    if (m_heap->collectionScope() == CollectionScope::Eden) {
+    if (heap().collectionScope() == CollectionScope::Eden) {
         forEachDirectory(
             [&] (BlockDirectory& directory) -> IterationStatus {
                 directory.snapshotUnsweptForEdenCollection();
index 006805b..ad36622 100644 (file)
@@ -95,7 +95,7 @@ public:
     MarkedSpace(Heap*);
     ~MarkedSpace();
     
-    Heap* heap() const { return m_heap; }
+    Heap& heap() const;
     
     void lastChanceToFinalize(); // Must call stopAllocatingForGood first.
     void freeMemory();
@@ -213,7 +213,6 @@ private:
     PreciseAllocation** m_preciseAllocationsForThisCollectionBegin { nullptr };
     PreciseAllocation** m_preciseAllocationsForThisCollectionEnd { nullptr };
 
-    Heap* m_heap;
     size_t m_capacity { 0 };
     HeapVersion m_markingVersion { initialVersion };
     HeapVersion m_newlyAllocatedVersion { initialVersion };
index df8531d..ea85146 100644 (file)
 
 namespace JSC {
 
+ALWAYS_INLINE Heap& MarkedSpace::heap() const
+{
+    return *bitwise_cast<Heap*>(bitwise_cast<uintptr_t>(this) - OBJECT_OFFSETOF(Heap, m_objectSpace));
+}
+
 template<typename Functor> inline void MarkedSpace::forEachLiveCell(HeapIterationScope&, const Functor& functor)
 {
     ASSERT(isIterating());
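The new `MarkedSpace::heap()` above recovers the owning `Heap` from `this` by subtracting the offset of the embedded `m_objectSpace` member, which is why the `m_heap` field can be deleted. This is the classic container_of idiom; a hedged standalone sketch with illustrative types (`Owner`, `Space`, `ownerOf` are placeholders, and a standard-layout type is used so plain `offsetof` is well-defined, whereas JSC uses its own OBJECT_OFFSETOF):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

struct Space { int dummy[4]; };

struct Owner {
    long header;
    Space space; // the embedded member we hold a pointer to
};

// Recover the enclosing Owner from a pointer to its Space member by
// subtracting the member's byte offset (container_of idiom).
Owner* ownerOf(Space* space)
{
    return reinterpret_cast<Owner*>(
        reinterpret_cast<uintptr_t>(space) - offsetof(Owner, space));
}
```

The result is pure pointer arithmetic with no memory load, which is what makes dropping the cached `Heap*` essentially free.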
index 94c2ff4..ab99e2d 100644 (file)
@@ -31,6 +31,7 @@
 #include "HeapCellType.h"
 #include "JSCInlines.h"
 #include "MarkedBlockInlines.h"
+#include "MarkedSpaceInlines.h"
 #include "ParallelSourceAdapter.h"
 #include "PreventCollectionScope.h"
 #include "SubspaceInlines.h"
@@ -49,7 +50,7 @@ void Subspace::initialize(HeapCellType* heapCellType, AlignedMemoryAllocator* al
     m_alignedMemoryAllocator = alignedMemoryAllocator;
     m_directoryForEmptyAllocation = m_alignedMemoryAllocator->firstDirectory();
 
-    Heap& heap = *m_space.heap();
+    Heap& heap = m_space.heap();
     heap.objectSpace().m_subspaces.append(this);
     m_alignedMemoryAllocator->registerSubspace(this);
 }
index b508114..34ea341 100644 (file)
@@ -119,6 +119,8 @@ protected:
     CString m_name;
 
     bool m_isIsoSubspace { false };
+protected:
+    uint8_t m_remainingLowerTierCellCount { 0 };
 };
 
 } // namespace JSC
index 55e7951..e98d7d9 100644 (file)
@@ -1,3 +1,23 @@
+2019-11-14  Yusuke Suzuki  <ysuzuki@apple.com>
+
+        [JSC] BlockDirectory's bits should be compact
+        https://bugs.webkit.org/show_bug.cgi?id=204149
+
+        Reviewed by Robin Morisset.
+
+        * wtf/FastBitVector.h:
+        (WTF::fastBitVectorArrayLength):
+        (WTF::FastBitVectorImpl::unsafeWords):
+        (WTF::FastBitVectorImpl::unsafeWords const):
+        (WTF::FastBitReference::FastBitReference):
+        (WTF::FastBitReference::operator bool const):
+        (WTF::FastBitReference::operator=):
+        (WTF::FastBitVector::at):
+        (WTF::FastBitVector::operator[]):
+        (WTF::FastBitVector::BitReference::BitReference): Deleted.
+        (WTF::FastBitVector::BitReference::operator bool const): Deleted.
+        (WTF::FastBitVector::BitReference::operator=): Deleted.
+
 2019-11-11  Ross Kirsling  <ross.kirsling@sony.com>
 
         UTC offset for Samoa is miscalculated when !HAVE(TIMEGM)
index a8ed6b8..885b7b1 100644 (file)
@@ -35,7 +35,7 @@ namespace WTF {
 
 class PrintStream;
 
-inline size_t fastBitVectorArrayLength(size_t numBits) { return (numBits + 31) / 32; }
+inline constexpr size_t fastBitVectorArrayLength(size_t numBits) { return (numBits + 31) / 32; }
 
 class FastBitVectorWordView {
     WTF_MAKE_FAST_ALLOCATED;
@@ -421,6 +421,9 @@ public:
     }
     
     typename Words::ViewType wordView() const { return m_words.view(); }
+
+    Words& unsafeWords() { return m_words; }
+    const Words& unsafeWords() const { return m_words; }
     
 private:
     // You'd think that we could remove this friend if we used protected, but you'd be wrong,
@@ -436,6 +439,38 @@ private:
     Words m_words;
 };
 
+class FastBitReference {
+    WTF_MAKE_FAST_ALLOCATED;
+public:
+    FastBitReference() = default;
+
+    FastBitReference(uint32_t* word, uint32_t mask)
+        : m_word(word)
+        , m_mask(mask)
+    {
+    }
+
+    explicit operator bool() const
+    {
+        return !!(*m_word & m_mask);
+    }
+
+    FastBitReference& operator=(bool value)
+    {
+        if (value)
+            *m_word |= m_mask;
+        else
+            *m_word &= ~m_mask;
+        return *this;
+    }
+
+private:
+    uint32_t* m_word { nullptr };
+    uint32_t m_mask { 0 };
+};
+
+
+
 class FastBitVector : public FastBitVectorImpl<FastBitVectorWordOwner> {
 public:
     FastBitVector() { }
@@ -518,42 +553,13 @@ public:
         return atImpl(index);
     }
     
-    class BitReference {
-    public:
-        BitReference() { }
-        
-        BitReference(uint32_t* word, uint32_t mask)
-            : m_word(word)
-            , m_mask(mask)
-        {
-        }
-        
-        explicit operator bool() const
-        {
-            return !!(*m_word & m_mask);
-        }
-        
-        BitReference& operator=(bool value)
-        {
-            if (value)
-                *m_word |= m_mask;
-            else
-                *m_word &= ~m_mask;
-            return *this;
-        }
-        
-    private:
-        uint32_t* m_word { nullptr };
-        uint32_t m_mask { 0 };
-    };
-    
-    BitReference at(size_t index)
+    FastBitReference at(size_t index)
     {
         ASSERT_WITH_SECURITY_IMPLICATION(index < numBits());
-        return BitReference(&m_words.word(index >> 5), 1 << (index & 31));
+        return FastBitReference(&m_words.word(index >> 5), 1 << (index & 31));
     }
     
-    BitReference operator[](size_t index)
+    FastBitReference operator[](size_t index)
     {
         return at(index);
     }