We are starting from:

Your Solr server is up and running, but it does not contain any data yet, so there is nothing to query. In this part we will load our own data and build our own index.
Quick recap:

Start cloud mode: bin\solr.cmd start -e cloud

We will create a collection called book_store with the _default configuration, i.e. following the parameters from tutorial 2, but with a different name and configuration.

We start with 1 node on the default port:

bin\solr.cmd start -e cloud

(If the collection already exists, just type 1 to use the existing one; to stop node1: bin\solr.cmd stop -p 8983)

1 node, running on the default port 8983, with 1 shard and 1 replica.

We will keep it at 1 of each to keep things simple.
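The interactive cloud example prompts for these values, but the same collection can also be created through Solr's Collections API. A minimal Python sketch of building that API call, assuming the single node on localhost:8983 as above (the helper name is mine):

```python
from urllib.parse import urlencode

SOLR = "http://localhost:8983/solr"  # the node from the recap above

def create_collection_url(name, config="_default", shards=1, replicas=1):
    """Build the Collections API URL that creates a collection."""
    params = urlencode({
        "action": "CREATE",
        "name": name,
        "collection.configName": config,
        "numShards": shards,
        "replicationFactor": replicas,
    })
    return f"{SOLR}/admin/collections?{params}"

# Opening this URL (in a browser, curl, or urllib) creates the collection:
print(create_collection_url("book_store"))
```

Requesting the printed URL against a running node does the same thing the interactive prompts do.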

Our node1 is empty for now; the index will appear here once we have created it with the parameters above.

Now we can visit the collection in the GUI; our index has been created, but it is still empty:



In the GUI, we select our collection, and let's insert some items.

We will use the Query tab; let's look at our index:

The query we run will also give us the HTTP query to paste into a browser:

Select all documents from the collection:
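That select-all query is just an HTTP GET against the collection's /select handler. A small sketch of building the URL, using the collection name and port from above (the helper name is mine):

```python
from urllib.parse import urlencode

def select_url(collection, query="*:*", host="http://localhost:8983"):
    """Build the /select URL shown by the admin UI's Query tab."""
    return f"{host}/solr/{collection}/select?" + urlencode({"q": query})

# Select everything in our collection (q=*:*, URL-encoded):
print(select_url("book_store"))
```

The printed URL is the same one the Query tab displays, just with the q parameter percent-encoded.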



Let's insert one JSON book manually, since we do not have a DataImportHandler defined.

We insert it in the Documents tab and submit.


Let's run the select again:

And there is our first book in the index. Cool!
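The same insert can also be sent over HTTP instead of through the Documents tab: Solr's JSON update handler accepts an array of documents. A sketch with a made-up example book (the field names id, title, and author are my assumptions, not the tutorial's actual schema):

```python
import json
from urllib.request import Request

def add_doc_request(collection, doc, host="http://localhost:8983"):
    """Build a POST that adds one JSON document and commits it."""
    url = f"{host}/solr/{collection}/update?commit=true"
    body = json.dumps([doc]).encode("utf-8")  # update handler takes a JSON array
    return Request(url, data=body, headers={"Content-Type": "application/json"})

# A hypothetical book; adjust the fields to your own documents.
book = {"id": "book-1", "title": "A First Book", "author": "Jane Doe"}
req = add_doc_request("book_store", book)
# urllib.request.urlopen(req) would send it to a running node.
```

commit=true makes the document visible to searches immediately, which matches what the Documents tab does.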

OK, let's stop Solr:

bin\solr.cmd stop -p 8983

Starting up again is just the same steps as in the beginning, and you could also provide the parameters in the batch file.

Now, let's insert some more documents and add a custom field to the default schema.

We will add a string field called tag:
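Since the _default configset uses a managed schema, the field can be added through Solr's Schema API rather than by editing the schema file by hand. A sketch of building that add-field request (the field name tag comes from the text; the helper name and stored=true choice are mine):

```python
import json
from urllib.request import Request

def add_field_request(collection, name, field_type="string",
                      host="http://localhost:8983"):
    """Build a Schema API POST that adds one stored field."""
    payload = {"add-field": {"name": name, "type": field_type, "stored": True}}
    return Request(
        f"{host}/solr/{collection}/schema",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Adds our string field "tag" when sent to a running node:
req = add_field_request("book_store", "tag")
```

Sending this to a running node has the same effect as adding the field through the admin UI's Schema screen.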

Now let's insert a new book with the custom field (the two other books have already been updated).

Let's paste the query into a browser (we could also use the admin UI):

Open http://localhost:8983/solr/book_store/select?q=*:* and view the result:

And here it is, looks good!

Now let's delete a doc:



And now we only have two documents:
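The delete can also be expressed as JSON sent to the same update handler we used for inserts. A sketch of building a delete-by-id request (the id book-1 is a placeholder of mine, not a real document from the tutorial):

```python
import json
from urllib.request import Request

def delete_doc_request(collection, doc_id, host="http://localhost:8983"):
    """Build a delete-by-id POST for the JSON update handler."""
    payload = {"delete": {"id": doc_id}}
    return Request(
        f"{host}/solr/{collection}/update?commit=true",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# "book-1" is a placeholder; use the id of the doc you want removed.
req = delete_doc_request("book_store", "book-1")
```

As with inserts, commit=true makes the deletion visible to the next select immediately.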


With that, we have inserted, updated, and deleted documents.

TBC (add Python with the DataImportHandler)