feat: search bar component (#95)
Initial work on the search-bar component

We have a search box and a button. It can be tested locally at http://search.localhost:8000/static/off.html.

At the moment it just sends the request; it does not display the results (a results component comes next).
alexgarel authored Apr 16, 2024
1 parent 339132a commit 6eaa132
Showing 23 changed files with 2,862 additions and 151 deletions.
2 changes: 1 addition & 1 deletion .gitignore
Original file line number Diff line number Diff line change
@@ -118,4 +118,4 @@ dmypy.json
/frontend/node_modules/
/frontend/lib/
/frontend/test/
/frontend/public/
/frontend/public/dist/
12 changes: 9 additions & 3 deletions Makefile
@@ -91,9 +91,15 @@ check:
# note: this is called by pre-commit
check_front: _ensure_network
${DOCKER_COMPOSE} run --rm -T search_nodejs npm run check
lint:
@echo "🔎 Running linters..."

lint: lint_back lint_front

lint_back:
@echo "🔎 Running linters for backend code..."
pre-commit run black --all-files

lint_front:
@echo "🔎 Running linters for frontend code..."
${DOCKER_COMPOSE} run --rm search_nodejs npm run format

#-------#
@@ -125,7 +131,7 @@ guard-%: # guard clause for targets that require an environment variable (usuall

import-dataset: guard-filepath
@echo "🔎 Importing data …"
${DOCKER_COMPOSE} run --rm api python3 -m app import /opt/search/data/${filepath} --num-processes=2
${DOCKER_COMPOSE} run --rm api python3 -m app import /opt/search/data/${filepath} ${args} --num-processes=2


#-------#
4 changes: 4 additions & 0 deletions README.md
@@ -79,6 +79,10 @@ If you get errors, try adding more RAM (12GB works well if you have that spare),

Typical import time is 45-60 minutes.

If you want to skip updates (e.g. because you don't have Redis installed),
use `make import-dataset filepath='products.jsonl.gz' args="--skip-updates"`.


## Fundings

This project has received financial support from the NGI Search (New Generation Internet) program, funded by the European Commission.
4 changes: 3 additions & 1 deletion app/_import.py
@@ -405,6 +405,7 @@ def run_full_import(
num_processes: int,
config: IndexConfig,
num_items: int | None = None,
skip_updates: bool = False,
):
"""Run a full data import from a JSONL.
@@ -447,7 +448,8 @@ def run_full_import(
with Pool(num_processes) as pool:
pool.starmap(import_parallel, args)
# update with last index updates (hopefully since the jsonl)
get_redis_updates(es_client, next_index, config)
if not skip_updates:
get_redis_updates(es_client, next_index, config)
# make alias point to new index
update_alias(es_client, next_index, config.index.name)

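The gating this hunk introduces can be sketched in isolation. This is a minimal sketch with stubbed names (`finish_import` and the stub body are hypothetical, not the real module): the Redis catch-up step only runs when `skip_updates` is left at its `False` default.

```python
def get_redis_updates(es_client, next_index, config):
    # stub standing in for the real Redis catch-up step
    return "caught-up"


def finish_import(es_client, next_index, config, skip_updates: bool = False):
    """Assumed condensation of the tail of run_full_import."""
    updates = None
    if not skip_updates:
        # update with last index updates (hopefully since the jsonl)
        updates = get_redis_updates(es_client, next_index, config)
    # the alias switch to the new index would follow here regardless
    return updates
```

Keeping the alias update outside the conditional matters: the new index must go live whether or not the catch-up ran.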
5 changes: 5 additions & 0 deletions app/cli/main.py
@@ -15,6 +15,10 @@ def import_data(
dir_okay=False,
help="Path of the JSONL data file",
),
skip_updates: bool = typer.Option(
default=False,
help="Skip fetching fresh records from Redis",
),
num_processes: int = typer.Option(
default=2, help="How many import processes to run in parallel"
),
@@ -63,6 +67,7 @@ def import_data(
num_processes,
index_config,
num_items=num_items,
skip_updates=skip_updates,
)
end_time = time.perf_counter()
logger.info("Import time: %s seconds", end_time - start_time)
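The new `--skip-updates` flag is a plain boolean option that defaults to off. An argparse analogue of the Typer option (names assumed, not the project's actual CLI module) shows the shape:

```python
import argparse

# Hypothetical argparse equivalent of the Typer option added above:
# a boolean flag, False unless --skip-updates is passed on the command line.
parser = argparse.ArgumentParser(prog="import")
parser.add_argument("filepath", help="Path of the JSONL data file")
parser.add_argument(
    "--skip-updates",
    action="store_true",
    help="Skip fetching fresh records from Redis",
)


def parse(argv):
    return parser.parse_args(argv)
```

The parsed value is then threaded straight through to `run_full_import(..., skip_updates=...)`, as the hunk above shows.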
6 changes: 5 additions & 1 deletion confs/nginx.conf
@@ -27,6 +27,10 @@ server {
# this is the internal Docker DNS, cache only for 30s
resolver 127.0.0.11 valid=30s;

# the dev web server has specific urls; redirect them to static
rewrite ^/(__wds-|__web-dev-server)(.*)$ /static/$1$2 last;


# Static files - in DEV = node server
location /static${DEV_UI_SUFFIX} {
proxy_set_header Host $host;
@@ -35,7 +39,7 @@ server {
set $search_nodejs search_nodejs;
# rewrite to get rid of /static
# double $ to avoid being interpreted by interpolation
rewrite ^/static${DEV_UI_SUFFIX}/(.*)$ /$1 break;
# rewrite ^/static${DEV_UI_SUFFIX}/(.*)$ /$1 break;
# important: do not add a trailing /, as node is picky about '//' in urls
proxy_pass http://$search_nodejs:8000;
}
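The rewrite rule added above can be reproduced outside nginx to check what it matches. A small sketch using Python's `re` module (the helper name is made up; the pattern and replacement mirror the nginx directive):

```python
import re

# Mirrors: rewrite ^/(__wds-|__web-dev-server)(.*)$ /static/$1$2 last;
# Dev-server paths get a /static prefix so the /static location proxies them.
DEV_SERVER_RE = re.compile(r"^/(__wds-|__web-dev-server)(.*)$")


def rewrite_dev_url(path: str) -> str:
    # non-matching paths pass through unchanged
    return DEV_SERVER_RE.sub(r"/static/\1\2", path)
```

Note the `last` flag in nginx: after the rewrite, processing restarts and the rewritten URL falls into the `/static${DEV_UI_SUFFIX}` location block.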
3 changes: 3 additions & 0 deletions docker/dev.yml
@@ -42,6 +42,9 @@ services:
# by setting PROD_UI_SUFFIX to "" and DEV_UI_SUFFIX to "-dev"
PROD_UI_SUFFIX: "${PROD_UI_SUFFIX--static}"
DEV_UI_SUFFIX: "${DEV_UI_SUFFIX-}"
volumes:
# dynamic mount
- ./frontend/public:/opt/search-a-licious/public


# Node service that creates the webcomponents
2 changes: 1 addition & 1 deletion frontend/Dockerfile
@@ -46,7 +46,7 @@ COPY --chown=node:node tsconfig.json /opt/search-a-licious/tsconfig.json
COPY --chown=node:node rollup.config.js /opt/search-a-licious/rollup.config.js
# build for production
# no need of a public url, we are at the root
RUN npm run build && npm run bundle
RUN rm -rf public/dist/* && npm run build && npm run bundle
CMD ["npm", "run", "serve"]

# nginx