Ticket #1036 (assigned enhancement)

Opened 14 years ago

Last modified 14 years ago

move search.py into a class

Reported by: tziade
Owned by: tziade
Priority: P2
Milestone: CPS 3.5.7
Component: CPSDefault
Version: TRUNK
Severity: normal
Keywords:
Cc:

Description

refactoring: search.py needs to be moved into a class so we can cut down its complexity

Change History

comment:1 Changed 14 years ago by tziade

see also #1035 for a use case

comment:2 Changed 14 years ago by fguillaume

  • Milestone changed from unspecified to CPS 3.4.0

comment:3 Changed 14 years ago by rspivak

I think we could add a special utility tool like 'portal_utils' to hold persistent stuff and utility code that needs to use CPS portal facilities. Besides search.py, the slow getContenInfo.py also comes to mind, and I'm sure there are other candidates waiting to be moved out of scripts.

comment:4 Changed 14 years ago by tziade

Right,

For this particular point (searching):

IMO it's time for us to create our own CPSCatalog on top of the base Catalog and make it a custom portal_catalog, where we can add all our custom search code into that class.

Skins are hell to debug and to test

Opinions?

comment:5 Changed 14 years ago by tziade

this can be done in 4 steps:

1/ create a subclass of Catalog, for portal_search

2/ add a search method to it (that would be search.py)

3/ add unit tests

4/ refactor the method to cut down its complexity by creating submethods
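
A minimal sketch of steps 1 and 2, assuming the subclass is based on the CMF CatalogTool and registered as portal_catalog; the class and method names are hypothetical, not actual CPS code:

  from Products.CMFCore.CatalogTool import CatalogTool

  class CPSCatalogTool(CatalogTool):
      """Step 1: a catalog tool subclass that holds the CPS search logic."""

      meta_type = 'CPS Catalog Tool'

      def search(self, **query):
          """Step 2: the body of search.py, moved into a testable method."""
          # Query clean-up/normalization goes here; step 4 would split it
          # into submethods. Then delegate to the base catalog query.
          query.setdefault('cps_filter_sets', 'searchable')
          return self.searchResults(**query)

Unit tests (step 3) can then exercise CPSCatalogTool.search directly, without going through a skin.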

comment:6 Changed 14 years ago by bdelbosc

What exactly is the problem here?

search.py is less than 100 lines of code; it is CPSDefault-specific glue between Python scripts and ZPTs, it keeps a stable interface and hides CPSDefault's internal index usage.

From Python code you should use the CMF portal_catalog directly with the appropriate CPS indexes. I don't think providing another portal catalog tool interface will help.

I am for keeping this Python script (or turning it into an unrestricted function) until CPS 3.5, as we will move to single-language proxies, which will require rethinking the catalog usage.

comment:7 Changed 14 years ago by tziade

What exactly is the problem here?:

The problem is that every time I have to debug something in search it's painful, because it is a skin and it's not unit tested IIRC. I want to unit test the catalog (search, index, etc.) without having to deal with anything else.

search.py is less than 100 lines of code [...] specific glue [...] stable interface and hide [...]:

What is "specific glue" for ZPTs? How does coding a skin make things more stable and more hidden? If I change this skin's code, how do I know whether there are regressions? This script is highly breakable IMHO.

.. I am for keeping this python script [...]:

Not me. I think it is a mistake to keep skins or orphan standalone functions that are in fact extensions of a class (look at the script callees), and I think we should refactor all these skins into class methods whenever we bump into debugging needs.

We did the same work on the local roles skins, because the skin was a huge brake on having a correct TDD approach to fix a bug.

It's obvious here: search.py just wraps catalog.query() to add CPS-specific things.

comment:8 Changed 14 years ago by fguillaume

I'm all for subclassing the CMF Catalog Tool if someone deals with the migration issues (we must not reindex everything nor generate events).

Otherwise monkey patch it.
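
For reference, a minimal sketch of the monkey-patch alternative; the added query defaulting is only illustrative, not the actual CPS behavior:

  from Products.CMFCore.CatalogTool import CatalogTool

  _orig_searchResults = CatalogTool.searchResults

  def cps_searchResults(self, REQUEST=None, **kw):
      # Illustrative CPS-specific preprocessing of the query, then delegate
      # to the original CMF implementation.
      kw.setdefault('cps_filter_sets', 'searchable')
      return _orig_searchResults(self, REQUEST, **kw)

  CatalogTool.searchResults = cps_searchResults

The patch sidesteps the migration issue (the existing portal_catalog object keeps its class), at the cost of being harder to trace and test.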

comment:9 Changed 14 years ago by tziade

  • Status changed from new to assigned
  • Owner changed from trac to tziade
  • Milestone changed from CPS 3.4.0 to CPS 3.4.1

I am moving it to 3.4.1 to work on a clean branch and control all migration paths.

comment:10 Changed 14 years ago by bdelbosc

Again, the script should only be called from ZPTs; nothing prevents you from unit testing the catalog using the CPS default indexes.

We are talking about two different things here:

  • subclassing the CMF portal catalog to add our proxy indexing capabilities in CPSCore is fine and better than monkey patching.
  • providing a search function to use our CPSDefault-specific indexes easily from a ZPT; this is glue between a simple ZPT query and the catalog query using CPSDefault indexes.

Moving a glue function into a class is not obviously right, as it is going to provide a redundant API and it is subject to change on each project.

comment:11 Changed 14 years ago by tziade

Again, the script should only be called from ZPTs; nothing prevents you from unit testing the catalog using the CPS default indexes:

Why is that? Does it mean the search API can't be used from code, or in an RPC call for example?

providing a search function to use our CPSDefault-specific indexes easily from a ZPT; this is glue between a simple ZPT query and the catalog query using CPSDefault indexes.

Moving a glue function into a class is not obviously right, as it is going to provide a redundant API and it is subject to change on each project.

I don't get what a glue function is, then. In search.py there are some manipulations on the query and a few calls to two catalog methods. That's it.

If it changes on each project, it means we'll have some extra properties on the catalog tool to customize the behavior, like what exists for instance in the TextIndexNG text indexer. The setup can then define the catalog property values. Maybe an extra "onBeforeQuery" event could be added to hook a script for query post-processing, for example, like what exists in CPSSchemas.
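
A rough sketch of that hook idea, assuming a catalog tool subclass; the "onBeforeQuery" property does not exist in CPS or CMF, it is only an illustration:

  from Products.CMFCore.CatalogTool import CatalogTool

  class CPSCatalogTool(CatalogTool):
      # Id of an optional script used to post-process the query before it is
      # sent to the catalog (hypothetical property, set by the site setup).
      onBeforeQuery = ''

      def searchResults(self, REQUEST=None, **kw):
          if self.onBeforeQuery:
              hook = getattr(self, self.onBeforeQuery, None)  # acquired script
              if hook is not None:
                  kw = hook(kw)  # the hook may rewrite the query mapping
          return CatalogTool.searchResults(self, REQUEST, **kw)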

But I don't see the gain of making the search chain dependent on a skin. Do you have an example?

comment:12 Changed 14 years ago by bdelbosc

I agree that a Python script in a skin should not be an API; that is why the only API to use is the portal_catalog itself. For example, searching for 'foo' is just:

  portal_catalog(cps_filter_sets='searchable', SearchableText='foo')

search.py is something that:

  • converts the user's search form submission (possibly dirty) into a clean query to the portal catalog
  • keeps backward compatibility with old parameter values for old projects
  • uses appropriate indexes, which are subject to change more often than the ZPT
  • makes it easy to customize the search result page using other document sets, indexes, catalogs or whatever

Perhaps I am wrong but I call this glue code.
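
For illustration, the glue described above could look roughly like this; the fallback parameter name is made up, and only 'cps_filter_sets' and 'SearchableText' are taken from the example earlier in this ticket:

  def build_catalog_query(form):
      # Turn a (possibly dirty) search form submission into a clean query.
      query = {'cps_filter_sets': 'searchable'}
      # 'searchable_text' is a hypothetical legacy parameter name kept for
      # backward compatibility.
      text = form.get('SearchableText', '') or form.get('searchable_text', '')
      if text.strip():
          query['SearchableText'] = text.strip()
      return query

  brains = portal_catalog(**build_catalog_query(REQUEST.form))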

If you want to add an extra property on the catalog to add some customized indexes by default and hide the knowledge of internal indexes, I am fine with that; it will factor out the 'cps_filter_set' part. But I don't think this will remove the need for glue code to make the link between the ZPT and the portal tool.

btw there is only one call to the portal_catalog api in search.py ;)

comment:13 Changed 14 years ago by tziade

In what you are saying, I still don't see any reason that requires the code to be a skin (except the backward compatibility issue, of course).

---

btw there is only one call to the portal_catalog api in search.py ;):

  portal_path = '/' + catalog.getPhysicalPath()[1] + '/'  # <- call number #1

  brains = catalog(**query)  # <- call number #2

;)

comment:14 Changed 14 years ago by sfermigier

  • Type changed from defect to enhancement
  • Milestone changed from CPS 3.4.1 to CPS 3.4.2