
[NIT-2570] Avoids fetching batch in execution layer, consensus layer fills all necessary information regarding a batch to the execution layer #2377

Merged: 8 commits merged into master from rm_fetch_batch on Jul 29, 2024

Conversation

@diegoximenes (Contributor) commented on Jun 10, 2024

Avoids fetching the batch in the execution layer; the consensus layer fills in all necessary information regarding a batch for the execution layer.

@cla-bot added the s label (automatically added by the CLA bot if the creator of a PR is registered as having signed the CLA) on Jun 10, 2024
@diegoximenes changed the title from "[NIT-2570] Avoids fetching batch in execution, consensus layer fills all necessary information regarding a batch to the execution layer" to "[NIT-2570] Avoids fetching batch in execution layer, consensus layer fills all necessary information regarding a batch to the execution layer" on Jun 10, 2024
@diegoximenes marked this pull request as ready for review on June 10, 2024 at 23:21
@@ -288,8 +288,7 @@ func (s *ExecutionEngine) resequenceReorgedMessages(messages []*arbostypes.Messa
log.Warn("skipping non-standard sequencer message found from reorg", "header", header)
A reviewer (Contributor) commented:

I want Lee's take on this, but I think around line 275, when resequencing delayed messages, we should check whether the delayed message is a BatchPostingReport message, and if so skip it and not resequence it.

@diegoximenes (Author) replied on Jun 18, 2024:

As discussed in a meeting, this behavior will not be changed in this PR.
Maybe this will be revisited in a future PR.
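
For context, here is a rough Go sketch of the check suggested above (which was not adopted in this PR). It assumes Nitro's arbostypes package, where message headers expose a Kind field and the L1MessageType_BatchPostingReport constant; the helper name shouldResequenceAfterReorg is illustrative only, not code from this PR.

```go
package gethexec

import (
	"github.com/ethereum/go-ethereum/log"

	"github.com/offchainlabs/nitro/arbos/arbostypes"
)

// shouldResequenceAfterReorg is a hypothetical helper sketching the suggestion
// above: when resequencing delayed messages after a reorg, skip batch posting
// reports instead of resequencing them. Types and the message-kind constant
// are assumed to come from Nitro's arbostypes package.
func shouldResequenceAfterReorg(msg *arbostypes.MessageWithMetadata) bool {
	header := msg.Message.Header
	if header.Kind == arbostypes.L1MessageType_BatchPostingReport {
		log.Info("skipping batch posting report while resequencing", "header", header)
		return false
	}
	return true
}
```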

@@ -544,6 +544,18 @@ func (s *TransactionStreamer) GetMessage(seqNum arbutil.MessageIndex) (*arbostyp
return nil, err
}

err = message.Message.FillInBatchGasCost(func(batchNum uint64) ([]byte, error) {
A reviewer (Contributor) commented:

We want to make absolutely sure that in every path through which the consensus client can return a message, it always has the batchGasCost filled in.

I think there are other cases we want to cover, specifically:
legacyGetDelayedMessageAndAccumulator
GetDelayedMessageAccumulatorAndParentChainBlockNumber

@diegoximenes (Author) replied:

Covered those other cases.

We don't have tests covering L1MessageType_BatchPostingReport today, right?
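
To illustrate the overall idea of the PR, here is a minimal, hypothetical Go sketch of the pattern shown in the diff above: the consensus layer fills in the batch gas cost itself, via a batch fetcher callback it already has, before handing a message to the execution layer. FillInBatchGasCost and its callback signature come from the diff; fillBatchInfo and fetchBatch are illustrative names, not code from this PR.

```go
package arbnode

import (
	"github.com/offchainlabs/nitro/arbos/arbostypes"
)

// fillBatchInfo is a hypothetical helper showing the pattern used in this PR:
// the consensus layer fills in the batch gas cost on the message (using a
// batch fetcher it already owns) so the execution layer never needs to fetch
// the batch itself. fetchBatch would typically be backed by the inbox tracker.
func fillBatchInfo(msg *arbostypes.MessageWithMetadata, fetchBatch func(batchNum uint64) ([]byte, error)) error {
	if msg == nil || msg.Message == nil {
		return nil // nothing to fill in
	}
	// FillInBatchGasCost computes and caches the cost of posting the batch,
	// fetching the batch data through the callback only when needed.
	return msg.Message.FillInBatchGasCost(fetchBatch)
}
```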

@diegoximenes requested a review from tsahee on June 18, 2024 at 12:47
@tsahee (Contributor) left a comment:

LGTM

@tsahee enabled auto-merge on July 29, 2024 at 22:13
@tsahee merged commit 569ec66 into master on Jul 29, 2024
13 checks passed
@tsahee deleted the rm_fetch_batch branch on July 29, 2024 at 22:56
Labels: design-approved, s
2 participants