Just from looking at the stack trace: does --no-turbo-load-elimination 
resolve the issue? If so, I wonder if the pass could simply abandon its 
efforts in the presence of a large number of elements. I don't suppose 
you've tried reducing your case until the performance is acceptable? That 
would help gauge where the quadratic behaviour becomes too much. 
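One way to bisect would be a generator along these lines (an untested 
sketch; the field count N and the object shape are illustrative guesses, 
not taken from your gist):

```javascript
// Sketch: generate a function that stores N distinct fields on one
// object -- the pattern that appears to stress LoadElimination -- and
// run it hot so TurboFan eventually optimises it.
const N = parseInt(process.argv[2] || '100', 10);

// Build a function body with N field stores.
let body = 'const o = {};\n';
for (let i = 0; i < N; i++) body += `o.f${i} = ${i};\n`;
body += 'return o;';
const f = new Function(body);

// Run it enough times to trigger optimisation, and time the main
// thread for reference.
const start = process.hrtime.bigint();
for (let i = 0; i < 1e4; i++) f();
const ms = Number(process.hrtime.bigint() - start) / 1e6;
console.log(`N=${N} fields: ${ms.toFixed(1)} ms on the main thread`);
```

Sweeping N (say, 100 vs 5000) with and without --no-turbo-load-elimination 
should show roughly where the cost blows up. Note the optimising compiler 
runs on a background thread, so watch overall process CPU rather than just 
this main-thread timing.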

Thanks,
Sam

On Thursday, June 29, 2023 at 5:40:12 AM UTC+1 Ben Noordhuis wrote:

> On Thu, Jun 29, 2023 at 2:02 AM Andrey Sidorov <andrey....@gmail.com> 
> wrote:
> >
> > Hi,
> >
> > Crossposting from node, issue 
> > https://github.com/nodejs/node/issues/48581
> >
> > We have a CPU spike in a node process while no JS is being executed. The 
> > time is likely spent in an optimiser thread.
> >
> > Steps to reproduce: run the first script in 
> > https://gist.github.com/sidorares/128160e6b3dea1da3ad45cd672651d2d#file-repro1-js
> > and watch CPU at 100% for quite some time until the process exits.
> >
> > Stacktrace:
> >
> > * thread #5
> >   * frame #0: 0x0000000101625d24 node`v8::internal::compiler::LoadElimination::AbstractField::Kill(v8::internal::compiler::LoadElimination::AliasStateInfo const&, v8::internal::MaybeHandle<v8::internal::Name>, v8::internal::Zone*) const + 68
> >     frame #1: 0x00000001016287cb node`v8::internal::compiler::LoadElimination::AbstractState::KillFields(v8::internal::compiler::Node*, v8::internal::MaybeHandle<v8::internal::Name>, v8::internal::Zone*) const + 107
> >     frame #2: 0x0000000101623484 node`v8::internal::compiler::LoadElimination::ReduceStoreField(v8::internal::compiler::Node*, v8::internal::compiler::FieldAccess const&) + 900
> >     frame #3: 0x000000010155578a node`v8::internal::compiler::Reducer::Reduce(v8::internal::compiler::Node*, v8::internal::compiler::ObserveNodeManager*) + 26
> >     frame #4: 0x00000001016931e9 node`v8::internal::compiler::(anonymous namespace)::SourcePositionWrapper::Reduce(v8::internal::compiler::Node*) + 57
> >     frame #5: 0x00000001015565aa node`v8::internal::compiler::GraphReducer::Reduce(v8::internal::compiler::Node*) + 154
> >     frame #6: 0x00000001015560f5 node`v8::internal::compiler::GraphReducer::ReduceTop() + 613
> >     frame #7: 0x0000000101555c38 node`v8::internal::compiler::GraphReducer::ReduceNode(v8::internal::compiler::Node*) + 216
> >     frame #8: 0x0000000101693dee node`v8::internal::compiler::LoadEliminationPhase::Run(v8::internal::compiler::PipelineData*, v8::internal::Zone*) + 718
> >     frame #9: 0x000000010168501b node`auto v8::internal::compiler::PipelineImpl::Run<v8::internal::compiler::LoadEliminationPhase>() + 123
> >     frame #10: 0x00000001016818f7 node`v8::internal::compiler::PipelineImpl::OptimizeGraph(v8::internal::compiler::Linkage*) + 455
> >     frame #11: 0x00000001016814fe node`v8::internal::compiler::PipelineCompilationJob::ExecuteJobImpl(v8::internal::RuntimeCallStats*, v8::internal::LocalIsolate*) + 142
> >     frame #12: 0x000000010034a01b node`v8::internal::OptimizedCompilationJob::ExecuteJob(v8::internal::RuntimeCallStats*, v8::internal::LocalIsolate*) + 43
> >     frame #13: 0x00000001003778e3 node`v8::internal::OptimizingCompileDispatcher::CompileNext(v8::internal::TurbofanCompilationJob*, v8::internal::LocalIsolate*) + 35
> >     frame #14: 0x0000000100378359 node`v8::internal::OptimizingCompileDispatcher::CompileTask::RunInternal() + 425
> >     frame #15: 0x000000010015304a node`node::(anonymous namespace)::PlatformWorkerThread(void*) + 362
> >
> > Is this a known issue/bug?
> > Any hints on 1) how to reduce the repro example even further, 2) what is 
> > causing the issue, or 3) any flags in node to toggle the optimiser on/off?
> >
> > Thanks,
> > Andrey
>
> I forgot to mention it in the issue, but the delay goes away when I
> disable TurboFan with --noopt or --max-opt=2.
>
> Tested V8 version is 11.3.244.8-node.9.
>

-- 
v8-dev mailing list
v8-dev@googlegroups.com
http://groups.google.com/group/v8-dev
--- 
You received this message because you are subscribed to the Google Groups 
"v8-dev" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to v8-dev+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/v8-dev/7dc6b882-1cb8-4921-ad01-418c38d20714n%40googlegroups.com.