
Tag: hibernate-search

Single return type in Hibernate Search

Let’s say I have an app with many different entities that have no relations to each other. I would like to create a search that queries all of them but returns a unified type, i.e.: So my idea was to index the entities and put their values into one single index (with the same index fields): As you can see, …
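A common way to approach this in Hibernate Search 6 is to index the shared values under the same field names in every entity, query all the indexes in one call, and project each hit onto one shared result type. The sketch below illustrates the idea only; SearchResultDto, Book, and the field names are hypothetical, and the jakarta.persistence imports may be javax.persistence on older versions.

```java
import java.util.List;

import jakarta.persistence.Entity;
import jakarta.persistence.EntityManager;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;

import org.hibernate.search.engine.backend.types.Projectable;
import org.hibernate.search.mapper.orm.Search;
import org.hibernate.search.mapper.orm.session.SearchSession;
import org.hibernate.search.mapper.pojo.mapping.definition.annotation.FullTextField;
import org.hibernate.search.mapper.pojo.mapping.definition.annotation.Indexed;

// Hypothetical unified result type shared by all searchable entities.
record SearchResultDto(String title, String description) {}

// Each entity indexes its values under the same field names ("title", "description").
@Entity
@Indexed
class Book {
    @Id @GeneratedValue
    Long id;

    @FullTextField(name = "title", projectable = Projectable.YES)
    String name;

    @FullTextField(name = "description", projectable = Projectable.YES)
    String summary;
}

class UnifiedSearch {
    // Queries several indexes at once and projects every hit onto the shared DTO.
    static List<SearchResultDto> search(EntityManager em, String terms) {
        SearchSession session = Search.session(em);
        return session.search(List.of(Book.class /*, Author.class, ... */))
                .select(f -> f.composite(
                        SearchResultDto::new,
                        f.field("title", String.class),
                        f.field("description", String.class)))
                .where(f -> f.match().fields("title", "description").matching(terms))
                .fetchHits(20);
    }
}
```

The entities stay independent; only the index field names and types have to line up so the same projection works for every hit.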

@IndexedEmbedded on class containing other @IndexedEmbedded fields

I’m trying to manage Hibernate Search indexing on a class with a field mapped by @IndexedEmbedded on a custom @Embeddable entity. This entity also contains other @IndexedEmbedded fields in the @MappedSuperclass. These are the entities involved: And these classes: When running the application, Hibernate Search throws the following error: I’m running it with the following dependencies: Any suggestion on how …
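The exact error message is cut off above, but a frequent problem with @IndexedEmbedded fields whose types themselves contain @IndexedEmbedded fields is an unbounded embedded graph (recursion between embedded types). The hedged sketch below only illustrates how includeDepth and includePaths bound the graph in Hibernate Search 6 (in 5.x the corresponding attribute is depth); all entities here are hypothetical, and the JPA imports may be javax.persistence on older stacks.

```java
import jakarta.persistence.Embeddable;
import jakarta.persistence.Embedded;
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;
import jakarta.persistence.ManyToOne;
import jakarta.persistence.MappedSuperclass;

import org.hibernate.search.mapper.pojo.mapping.definition.annotation.FullTextField;
import org.hibernate.search.mapper.pojo.mapping.definition.annotation.Indexed;
import org.hibernate.search.mapper.pojo.mapping.definition.annotation.IndexedEmbedded;

@Entity
class User {
    @Id @GeneratedValue Long id;
    @FullTextField String username;
}

@Entity
class Country {
    @Id @GeneratedValue Long id;
    @FullTextField String name;
}

// Mapped superclass whose @IndexedEmbedded field may itself contain
// @IndexedEmbedded fields; includeDepth bounds how deep indexing follows them.
@MappedSuperclass
abstract class AuditedEntity {
    @ManyToOne
    @IndexedEmbedded(includeDepth = 1)
    protected User author;
}

@Embeddable
class Address {
    @FullTextField
    String city;

    // Only the "name" field of Country is pulled into the index, which also
    // breaks any cycle running through Country.
    @ManyToOne
    @IndexedEmbedded(includePaths = "name")
    Country country;
}

@Entity
@Indexed
class Customer extends AuditedEntity {
    @Id @GeneratedValue
    Long id;

    @Embedded
    @IndexedEmbedded
    Address address;
}
```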

Hibernate Search 6: Methods mapping

In Hibernate Search 5.x I can map an entity method as a full-text field like this: Then I have a field named “wholeName” in my full-text index, and it contains the return value of the getWholeName() method. How can I do this in Hibernate Search 6? I have only found a way to map an entity field, not a method. Thank you!
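In Hibernate Search 6 the field annotations can sit directly on a getter. For a value derived from other persisted properties, the documented pattern is to mark the getter @Transient (so Hibernate ORM ignores it) and declare its dependencies with @IndexingDependency, so the index is refreshed when those properties change. A minimal sketch, using a hypothetical Person entity:

```java
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;
import jakarta.persistence.Transient;

import org.hibernate.search.mapper.pojo.mapping.definition.annotation.FullTextField;
import org.hibernate.search.mapper.pojo.mapping.definition.annotation.Indexed;
import org.hibernate.search.mapper.pojo.mapping.definition.annotation.IndexingDependency;
import org.hibernate.search.mapper.pojo.mapping.definition.annotation.ObjectPath;
import org.hibernate.search.mapper.pojo.mapping.definition.annotation.PropertyValue;

@Entity
@Indexed
public class Person {

    @Id @GeneratedValue
    private Long id;

    @FullTextField
    private String firstName;

    @FullTextField
    private String lastName;

    // The field annotation sits on the getter. @Transient keeps Hibernate ORM from
    // mapping the derived property, and @IndexingDependency declares which persisted
    // properties the value is derived from, so changes to them reindex "wholeName".
    @Transient
    @FullTextField(name = "wholeName")
    @IndexingDependency(derivedFrom = {
            @ObjectPath(@PropertyValue(propertyName = "firstName")),
            @ObjectPath(@PropertyValue(propertyName = "lastName"))
    })
    public String getWholeName() {
        return firstName + " " + lastName;
    }
}
```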

Can a Hibernate Search FieldBridge configure facets for dynamic fields?

Using Hibernate Search 5.11.3 with the programmatic API (no annotations), is there a way to facet on dynamic fields added in a class or field bridge? I don’t see any facet configuration available in FieldMetadataBuilder when using MetadataProvidingFieldBridge. I have tried various combinations of luceneOptions.addSortedDocValuesFieldToDocument() and luceneOptions.addFieldToDocument() in the set() method. This successfully updates the index, but I cannot perform facet …
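For context, this is roughly the kind of bridge the question describes: a MetadataProvidingFieldBridge that declares dynamic string fields and writes both a regular field and doc values for each entry of a (hypothetical) Map<String, String> property. As the question notes, FieldMetadataBuilder exposes sortability but no facet configuration, and the doc values alone do not register the fields with Hibernate Search’s faceting API. This is a sketch of the setup, not an answer to the faceting question.

```java
import java.util.Map;

import org.apache.lucene.document.Document;
import org.hibernate.search.bridge.LuceneOptions;
import org.hibernate.search.bridge.MetadataProvidingFieldBridge;
import org.hibernate.search.bridge.spi.FieldMetadataBuilder;
import org.hibernate.search.bridge.spi.FieldType;

public class DynamicPropertiesBridge implements MetadataProvidingFieldBridge {

    @Override
    public void configureFieldMetadata(String name, FieldMetadataBuilder builder) {
        // Declares the base field as a sortable string field; there is no
        // facet-related option to set here.
        builder.field(name, FieldType.STRING).sortable(true);
    }

    @Override
    public void set(String name, Object value, Document document, LuceneOptions luceneOptions) {
        @SuppressWarnings("unchecked")
        Map<String, String> properties = (Map<String, String>) value;
        properties.forEach((key, val) -> {
            String fieldName = name + "." + key;
            // Indexed field for searching...
            luceneOptions.addFieldToDocument(fieldName, val, document);
            // ...plus doc values, which make the field sortable but do not by
            // themselves make it available to the faceting engine.
            luceneOptions.addSortedDocValuesFieldToDocument(fieldName, val, document);
        });
    }
}
```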

Why is Lucene sometimes not matching InChIKeys?

I have indexed my database using Hibernate Search. I use a custom analyzer for both indexing and querying. I have a field called inchikey that should not be tokenized. Example values are: BBBAWACESCACAP-UHFFFAOYSA-N, KEZLDSPIRVZOKZ-AUWJEWJLSA-N. When I look into my index with Luke, I can confirm that they are not tokenized, as required. However, when I try to search for them …
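When the indexed terms look correct in Luke but searches still miss, query-time analysis is a common suspect: an analyzer that splits on hyphens will break the InChIKey into several tokens before the lookup. A minimal sketch (assuming Hibernate Search 5.x, with a hypothetical Compound entity) that sends the raw key as a single, unanalyzed term via ignoreAnalyzer(), and maps the field with analyze = Analyze.NO so it is never tokenized:

```java
import java.util.List;

import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

import org.apache.lucene.search.Query;
import org.hibernate.search.annotations.Analyze;
import org.hibernate.search.annotations.Field;
import org.hibernate.search.annotations.Indexed;
import org.hibernate.search.jpa.FullTextEntityManager;
import org.hibernate.search.jpa.Search;
import org.hibernate.search.query.dsl.QueryBuilder;

// Hypothetical entity: analyze = Analyze.NO keeps the key as one untokenized term.
@Entity
@Indexed
class Compound {
    @Id @GeneratedValue Long id;

    @Field(analyze = Analyze.NO)
    String inchikey;
}

public class InchikeyLookup {

    // ignoreAnalyzer() bypasses query-time analysis, so the match only succeeds if
    // the indexed value really is a single untokenized token; this helps separate
    // index-time from query-time analysis problems.
    public static List<?> findByInchikey(EntityManager em, String inchikey) {
        FullTextEntityManager ftem = Search.getFullTextEntityManager(em);
        QueryBuilder qb = ftem.getSearchFactory()
                .buildQueryBuilder()
                .forEntity(Compound.class)
                .get();

        Query query = qb.keyword()
                .onField("inchikey")
                .ignoreAnalyzer()
                .matching(inchikey)
                .createQuery();

        return ftem.createFullTextQuery(query, Compound.class).getResultList();
    }
}
```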
