ExpressionEngine CMS
Open, Free, Amazing


This is an archived forum and the content is probably no longer relevant, but is provided here for posterity.


DataMapper 1.6.0

September 05, 2008 12:32pm

  • #136 / Oct 11, 2008 2:43pm

    Maxximus

    55 posts

    For those who use KHCache (a very nice caching system) and want to cut down on the expensive list_fields() call, change line 105 in DataMapper to:

    // Get and store the table field names
    if (KH_CACHE)
    {
        $key = $this->khcache->generatekey(__CLASS__, __FUNCTION__, $this->table);
        if (($data = $this->khcache->fetch($key)) !== false)
        {
            $this->fields = $data;
        }
        else
        {
            $this->fields = $this->db->list_fields($this->table);
            $this->khcache->store($key, $this->fields);
        }
    }
    else
    {
        $this->fields = $this->db->list_fields($this->table);
    }

    Using KHCache for your other queries too would need a bit more work, but if it works reliably it would be great.

    *Rethinking* Perhaps better to alter the AR cache method…

    For that reason I created a drop-in replacement for DB_cache.php. See KHCache.

  • #137 / Oct 11, 2008 8:09pm

    stensi

    109 posts

    @OverZealous.com: I can see what you’re trying to achieve with that code and it’s a good idea.  Sending only the changed fields in an UPDATE would indeed save on the bandwidth and processing time over a longer period.

    I’ve started testing a solution for this which is a little different to yours but with the same result (uses the existing _changed functions).  If it tests ok, I’ll include it in the next version.  I’ve got a very large list of things I want to include/improve now, which keeps growing, so it’s now just a matter of having the time to get around to it.

    UPDATE

    I’ve finally got a working solution for this that required very few changes.  When you do a save to an existing object (update), only the changed fields will be sent in the Database query, making it much more efficient (thanks for pestering me to do this, lol!).

    When saving a new object (insert), it will send everything except the id (which you should have as an AUTO INCREMENT field).

    These changes should solve all the issues you were having with your PostgreSQL insert/update queries.  It works fine for MySQL as well.
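    If it helps to picture it, the changed-fields idea boils down to something like this (a simplified sketch, not DataMapper's actual code; the function name and the $original snapshot are hypothetical):

```php
<?php

// Sketch: only fields whose values differ from the snapshot taken when
// the object was loaded end up in the UPDATE. Illustrative names only.
function changed_fields(array $original, array $current)
{
    $changed = array();
    foreach ($current as $field => $value)
    {
        // Strict comparison so NULL, 0 and '' count as distinct values
        if ( ! array_key_exists($field, $original) OR $original[$field] !== $value)
        {
            $changed[$field] = $value;
        }
    }
    return $changed;
}

$original = array('id' => 7, 'name' => 'Fred', 'email' => null);
$current  = array('id' => 7, 'name' => 'Freddy', 'email' => null);

// Only 'name' differs, so only 'name' would be sent in the query
print_r(changed_fields($original, $current));
```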


    @Maxximus: Caching for things like that is definitely something I want to look at including in a later version and Neophyte’s KhCache is an awesome library for that sort of thing.  I’m not all that happy with the AR caching.

    I haven’t seen or used the glob() function before.  Would it be more efficient for me to use that instead of the recursive_require_once I’ve created?

    Oh, and I’ll be including this code of yours in the next version as well 😉

    /* don't try to autoload CI_ or MY_ prefixed classes */
    if (strstr($class, 'CI_') OR strstr($class, 'MY_')) return;
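    For reference, a glob()-based loader might look roughly like this (a sketch only; note that a plain glob() does not descend into subdirectories, which may be exactly why a recursive require was needed in the first place):

```php
<?php

// Sketch of a glob()-based model loader. The path handling is an
// assumption; glob() returns a sorted array of matching file paths.
function load_model_files($path)
{
    $loaded = array();
    foreach (glob(rtrim($path, '/') . '/*.php') as $file)
    {
        require_once($file);
        $loaded[] = basename($file);
    }
    return $loaded;
}
```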

    UPDATE

    So far, the next version just has the changes I’ve mentioned in this post.  I might release that version later tonight and then go back through all the forum posts and compile a full list of requested additions/features, stick those in the list of things I’m already wanting to do, and then sort it out in order of most useful.

    I’ll probably post the top 10 items from that list so everyone can give some feedback on whether they’d like them added in the way I envision or if they’d prefer it done differently.

  • #138 / Oct 12, 2008 9:37am

    stensi

    109 posts

    Version 1.4.2 has been released!

    View the Change Log to see what’s changed.

    In short, I improved the performance of the save method when updating an existing object, so it only updates the changed data.  The autoload method no longer attempts to autoload CI_ or MY_ prefixed classes (so no recursive searching for classes that definitely won’t exist in the models folder).

    As I said above, I’ll now be taking a look at all the requested additions/features people have posted here.  However, I won’t have as much time in the coming weeks since I’m going to Japan with my wife on holiday 😊

    In the meantime, I welcome any new suggestions/feature requests for what you’d like to see in a future version of DataMapper!

  • #139 / Oct 12, 2008 9:50am

    nfx-nano

    27 posts

    Hello, and ‘grats on an excellent library (+docs).

    However, I believe it’s missing an important function.

    Example: user Fred has 100 computer items in stash.

    The join table:

    id - user_id - item_id - amount

    There is currently no way DataMapper can access the field amount from this join table. Sometimes I need to add additional fields to the join table, and it would be great to be able to call those fields with DataMapper, as such:

    // Get Fred
    $user = new User();
    $user->where('user_name', 'Fred')->get();
    
    // Get the computer items
    $user->item->where('item_name', 'computers')->get();
    
    // Show the amount of computers related to Fred
    echo $user->item->amount;

    I wouldn’t mind being able to do this either:

    // Get the related items
    $user->item->where('amount', '100')->get();
  • #140 / Oct 12, 2008 4:22pm

    ntheorist

    84 posts

    hi, thanks for this great library!

    forgive me if this topic has been covered before. Regarding the validation, is there a way to bypass certain validation rules if the field is not required?

    for instance:

    array(
         'field' => 'zip',
         'type'  => 'numeric',
         'label' => 'Zip',
         'rules' => array('trim','numeric')
    ),
    array(
         'field' => 'email',
         'type'  => 'email',
         'label' => 'Email',
         'rules' => array('trim','valid_email','max_length'=>120)
    ),

    Neither field has ‘required’ in its rules, but because empty values fail the numeric and valid_email tests, the datamap doesn’t save and an error is generated, essentially making them required anyway. But of course, I don’t want to remove the numeric/valid_email tests altogether.

    I suppose I could have the validation check whether there’s a value, and if not, suppress all tests besides ‘required’. But before I go messing up your code I thought I’d ask here, since this issue may have been resolved already and I just couldn’t find it.
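    The check described above could be sketched like this (a hypothetical helper, not DataMapper's actual validation code; rule names mirror the CodeIgniter-style rules in the arrays above):

```php
<?php

// Sketch: run a field's rules only when it is required or actually has
// a value. Hypothetical helper, not DataMapper's real validation code.
function should_validate($value, array $rules)
{
    $required  = in_array('required', $rules);
    $has_value = ($value !== '' && $value !== null);

    return ($required || $has_value);
}
```

    So an empty, non-required zip field would skip the numeric rule entirely, while a filled-in one would still be checked.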

    thx,

    CC

  • #141 / Oct 12, 2008 5:01pm

    Maxximus

    55 posts

    Good work, hope you have a nice holiday!

    One thing about performance:

    Unfortunately, doing a $Obj->get() seems to call the constructor again for each record (_to_object does new $model).

    Because of this it calls _assign_libraries() again for every record. This was also true for the ‘old’ Module version, by the way.

    This adds up quickly for more than 20 records. Currently I see two options to kill this CPU and memory hog: perhaps it’s not needed to call new Model() every time, and/or replace _assign_libraries() with only the needed libraries (DB and config).

    Suggested new _assign_libraries():

    function _assign_libraries()
    {
        $CI =& get_instance();
        $this->config = $CI->config;
        $this->db = $CI->db;
    }

    It’s probably always possible to add any additional needed libraries in the model itself.

    *UPDATE*
    This might do the trick in _to_object:

    if (get_class($item))
    {
        $item = clone $item;
    }
    else
    {
        $item = new $model;
    }
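    A quick way to see why the clone branch helps (a standalone demonstration, not DataMapper code): clone never calls the constructor, so per-record setup work runs only once.

```php
<?php

// Stand-in for a model whose constructor does expensive setup
// (like _assign_libraries()). Demonstration only.
class Heavy_model
{
    public static $constructions = 0;

    public function __construct()
    {
        self::$constructions++; // pretend this is the expensive part
    }
}

$prototype = new Heavy_model(); // constructor runs once, here
$rows = array();
for ($i = 0; $i < 20; $i++)
{
    $rows[] = clone $prototype; // no constructor call per record
}
echo Heavy_model::$constructions; // 1
```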
  • #142 / Oct 12, 2008 6:56pm

    stensi

    109 posts

    @siric: You can still do what you’re wanting with the current version, but it means having your tables set up differently, most likely with another normal table (such as orders) and a joining table.  There’s a number of ways you could do it actually, depending on your preference, but yes, it will mean having more tables.

    I originally never planned on adding other fields into the joining tables since those were meant to hold only the relationship data and nothing more.  For situations like the one you mentioned, and the one Phil (OverZealous) mentioned a few pages back, being able to have extra fields in the joining tables would indeed be handy.  I’m just not sure how easy it will be, if at all possible, to implement with the same usage as is currently used to access related data.  I mean, when doing this:

    // Get the related items
    $user->item->where('amount', '100')->get();

    If we were to allow fields on the joining table as well, from the above code it would be very difficult to distinguish whether the developer wants to access the “amount” field in the items_users joining table or in the items table.  And what happens if they have multiple where clauses, mixing clauses for fields on the joining table with clauses for fields on the related table:

    $user->item->where('amount', '100')->where('name', 'hammer')->get();

    I think that would be a nightmare for me to cater to, lol.  If I add in the ability to have other fields on relationship tables, the usage would have to be different for those fields.  Maybe this?

    $user->item->join_where('amount', '100')->where('name', 'hammer')->get();

    How’s that sound?  Hmm, how to save and update that data though… That’s going to be difficult.  I can see myself getting headaches from this one, lol 😉
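    In case it helps the discussion, the disambiguation a join_where() would buy could be as simple as qualifying the field with the joining table’s name (a rough sketch; the helper and the table-naming convention are assumed from the items_users example above):

```php
<?php

// Sketch: a join_where() targets the joining table explicitly, while a
// plain where() targets the related table. Hypothetical helper only.
function qualified_where($table, $field, $value)
{
    return sprintf("%s.%s = '%s'", $table, $field, $value);
}

echo qualified_where('items_users', 'amount', '100'); // joining table
echo "\n";
echo qualified_where('items', 'name', 'hammer');      // related table
```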


    @commandercool: Ah, nice find.  I hadn’t noticed that.  I’ve made a change that will solve this issue.  If a field is not required, it will only have its validation rules applied if it has a value.  That one’s a quick fix so I’ve got it in the next version.


    @Maxximus: Thanks 😊 and good point.  I’ll run some tests to see which approach saves the most CPU time/Memory and let you know my findings.

  • #143 / Oct 12, 2008 7:54pm

    OverZealous

    1030 posts

    Regarding the joining table values:

    I think it might make sense to set one limitation, which then makes it a lot easier.  Declare that a joining table cannot have a field with the same name as a field in either of the tables it joins.

    When a related model is set up, all of the extra fields in the joint table are added to a second fields array (if necessary), that I’ll call joint_fields.

    This way, when the join is queried, joint fields can be added automatically to the child models (meaning, in the example above, you could access the joint fields either through $user->book->amount or through $book->user->amount, or via ->all[$index]->amount).  Joint fields could be added in the SELECT statement.  The developer could set the joint field in the exact same manner.

    Then, when performing updates, the joint fields could be checked during _related saves, and saved into the appropriate table.

    To handle the ->where(), the field is checked to see if it exists in the fields[] or joint_fields[] array, and set accordingly.  Although, I actually like the join_where better than this, especially as far as work for stensi goes 😜.

    Also, @stensi, I like most of the updates (I haven’t had time to integrate them) but one problem still exists (I’m sorry!).  Since the inserts still send empty fields, there is still no way (at least on PostgreSQL) to have a default, unset value.  PG simply sees the field as ‘’, which errors on most field types and ignores the column’s default value.  And since the _clear() method is called on creation of the model, any defaults set in the model are wiped out.

    (Example of where this is important: I often include a field called hidden, which is used to allow the user to hide a row without deleting it.  Since you never create a non-hidden value, I simply set the default to FALSE in the DB, and ignore it during inserts.)

    To me, it still makes sense to filter the insert values.

    - Phil DeJarnett

  • #144 / Oct 12, 2008 8:34pm

    stensi

    109 posts

    I’ll have a more in-depth look at how I’ll implement fields within joining tables when I get back from Japan.  My only concern is that I’ll probably have to do a list_fields() on the joining tables, which isn’t very good for performance.  Also, enforcing the rule that a joining table cannot have the same fields as either of the tables it’s joining will require a list_fields() call on all three tables 😕

    With your insert issue, will unsetting the properties during _clear(), then only including those that are set in the insert query, solve this issue for you?  The thing is, I still need to insert a value if it is NULL, 0 (zero), or an empty string, since those are valid Database values.  I know I’ll want to be inserting those.

    I guess it’ll depend on whether PHP considers an unset property’s value to be different from a property set to NULL.  If it does, then the above changes should be do-able.

  • #145 / Oct 12, 2008 9:41pm

    OverZealous

    1030 posts

    Yes, unsetting during clear and only including those that are set is perfect.  I think I wrote something similar to that in an earlier post.

    An unset() property (tested with isset()) is different from empty().  isset() only returns false for values that are not there, whereas it will return true for NULL, 0, etc.  It’s in the PHP isset documentation.

    If you did that, my problems would be all gone.  😊

  • #146 / Oct 12, 2008 9:49pm

    stensi

    109 posts

    Hmm, that’s not true according to the PHP documentation: isset()

    It says isset() will return FALSE if the variable is set to NULL, so that won’t work.  I still want NULLs included.  I could try this:

    // Only include if field isset or is null
    if (isset($this->{$field}) OR is_null($this->{$field}))
    {
        // include in $data
    }

    But that will depend on whether is_null returns TRUE only for NULL fields.  If it considers unset fields to be NULL as well, then that won’t work either, as it’ll be the same situation.  I’ll test is_null to find out.

    UPDATE

    Nope, that’s not going to work either :down:

    Although… if a Database field accepts NULL but no value is supplied to it, it will default to NULL, right?  Likewise, a varchar field with no characters supplied will default to an empty string.  If this is right, then I can go ahead and only include fields that pass the isset() check.  Your thoughts?
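    For what it’s worth, PHP can tell the two cases apart directly: isset() is FALSE for a property set to NULL, but property_exists() (available since PHP 5.1) still reports it, while a never-set property fails both checks:

```php
<?php

$obj = new stdClass();
$obj->email = null;    // explicitly set to NULL
$obj->name  = 'Fred';

var_dump(isset($obj->email));             // bool(false) -- NULL fails isset()
var_dump(property_exists($obj, 'email')); // bool(true)  -- but it WAS set
var_dump(property_exists($obj, 'phone')); // bool(false) -- never set at all
```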

  • #147 / Oct 12, 2008 11:19pm

    OverZealous

    1030 posts

    I apologize - I’ve been wrong about that for some time :red: !

    Why doesn’t that work?

    Also, you probably need to be able to set a field to NULL on purpose.  Meaning, I need to be able to NULL out a value, if it was previously set to a value.  In that case, isset() on its own will not work.

    Update: Duh, now I see why.  Hmm… Must think this through.

    Update 2:  I guess you could re-look at my above suggestion to keep track of changes from within __set().  It would handle changes made for both updates and inserts.

  • #148 / Oct 12, 2008 11:34pm

    stensi

    109 posts

    There’s no issue with an UPDATE query, since if you set the value to NULL and it’s not NULL in the Database, it will recognise that you’re changing this value and will include it in the UPDATE query.

    INSERT is a little different though since it has nothing to compare against like UPDATE does, so yes, it’s a lot trickier.  Still, I think I can leave NULL or empty string values out of the $data array since those should default to NULL or empty string when inserting without them.

  • #149 / Oct 12, 2008 11:40pm

    OverZealous

    1030 posts

    Ah, I see now, you’re right.  There’s no reason to insert a NULL ever, but NULL updates are fine, etc, etc.  I need to not read so late at night :cheese: .

    Yes, then, I would go with the isset() check for inserts (which also should dump your [id] automatically), and update _clear() to default the values to NULL.  This should fix everything that can be fixed with PGSQL, and reduce DB overhead a tiny bit.
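    In outline, the agreed approach might look something like this (a simplified sketch; DataMapper’s real field handling differs and the function name is made up):

```php
<?php

// Sketch: build the INSERT data from only the fields the developer
// actually assigned. Unset fields (including 'id') and fields set to
// NULL both fail isset(), so the database applies its column defaults.
function insert_data($object, array $fields)
{
    $data = array();
    foreach ($fields as $field)
    {
        if (isset($object->{$field}))
        {
            $data[$field] = $object->{$field};
        }
    }
    return $data;
}

$user = new stdClass();
$user->name   = 'Fred';
$user->hidden = null;   // left to the column default

// only 'name' is included
print_r(insert_data($user, array('id', 'name', 'hidden')));
```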

    Thank you for your attentiveness on this!  I have an important, really big project that I’m starting, and DM is going to dramatically reduce my development time!!  I can’t thank you enough for your time and development effort.  If I weren’t poor as <expletive>, I’d send you money 😛 .

  • #150 / Oct 12, 2008 11:46pm

    stensi

    109 posts

    No problem :lol:

    I’ve got a lot of upcoming projects as well that will be using DataMapper, which is why I’m happy to put in time to improve it as much as possible, before I start on them.  I’m very thankful for people like yourself who take the time to suggest and help work through the problems, so thank you as well 😊

