Are you saying we *can* import categories into multiple category groups with your update? Great, if yes! I’m not clear on what the separator is… if pipe ‘|’ doesn’t work, what should be used? What’s the purpose of “Category_delimiter…”?
Nick - No, my fix does not do multiple category groups. The original CSVgrab can put a single item into multiple categories, as long as they are all in the same group. To separate the multiple categories, you use the delimiter, which defaults to “,”. You can specify a different delimiter, but due to the way the original CSVgrab split up the list of categories, “|” did not work (because it’s a meta-character in regular expressions). I just used a different splitting mechanism (explode() instead of split()).
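The difference can be sketched in a few lines of PHP (the category names here are made up for illustration):

```php
<?php
// split() treated its delimiter as a POSIX regex, so "|" (the regex
// alternation metacharacter) never matched literally. explode() does a
// plain string split, so any delimiter character works.
$categories = "News|Sports|Weather";   // hypothetical category list

$parts = explode("|", $categories);    // literal split: works with "|"
print_r($parts);                       // News, Sports, Weather

// A regex-based splitter needs the pipe escaped to behave the same way:
$parts = preg_split('/\|/', $categories);
print_r($parts);
```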
Andrew,
Do you think you could add a little note about the multibyte thing in the documentation (like the tip you have for the URLs in the control panel)? I got bitten by it today, and 18 pages is harder to pore through than a single page. It seems to be common enough.
Thanks.
I also got bitten by the newline (\n) Mac thing. Might also be worth a note.
Is there a way to run CSVgrab templates (and perhaps other EE scripts) through the command line using PHP instead of cURL?
I’m using CSVgrab to manage a very large amount of data. cURL is timing out and using too much CPU for MySQL. Our host said that using PHP to run the commands would expedite things and make them more efficient simply by removing them from the web environment.
Has anyone here ever needed to make adjustments like these or noticed any thread talking about anything like this?
Thanks so much, .ccb.
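In case it helps: one way to take the request out of the web environment is a small CLI wrapper that fakes the superglobals EE’s front controller reads and then includes index.php directly. Everything below (the paths, the template group/name, and which variables your EE version actually inspects) is an assumption to adapt, not a drop-in:

```php
<?php
// Hypothetical CLI runner for an EE template containing {exp:csvgrab}.
// prepare_request() fakes a web request to /import/csvgrab; adjust the
// template path and document root for your own install.
function prepare_request(string $templatePath): void
{
    $_SERVER['PATH_INFO']    = '/' . ltrim($templatePath, '/');
    $_SERVER['REQUEST_URI']  = $_SERVER['PATH_INFO'];
    $_SERVER['QUERY_STRING'] = '';
    set_time_limit(0); // a CLI run is not bound by the web server's timeout
}

prepare_request('import/csvgrab');
// chdir('/home/site/public_html');   // EE expects to run from its docroot
// require 'index.php';               // bootstrap the front controller
```

You would then call it from cron as `php runner.php` instead of the cURL command; the two commented lines do the actual bootstrap.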
I am having a torrid time getting the plugin to even read my file. Pretty mystified.
Debug Checklist:
• Permissions are all good; the file is readable by all
• fopen is switched on in php.ini
• I think my line endings are fine. I’m on a Mac, but I exported from phpMyAdmin in Windows Excel CSV format (how do I confirm this?)
• Here’s my CSV file
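To answer the “how do I confirm this?” part: two quick checks from Terminal (substitute your own filename):

```shell
# "file" names the line-ending style explicitly, e.g.
# "ASCII text, with CRLF line terminators" for Windows-style endings:
file accom_province.csv

# "od -c" prints the raw bytes, so \r \n versus plain \n is visible directly:
od -c accom_province.csv | head -n 2
```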
I have the following in my template:
{exp:csvgrab
url="http://bestafricatravel.com/data/accom_province.csv"
weblog="2"
title="1"
delimiter=","
encloser="QUOTE"
use="2|3|4|5"
fields="url_title|location_lat|location_long|location_zoom_level"
category_group="2"
category="23"
author_id="1"}
Any help would be awesome, thanks!
By the way, I should mention the error I’m getting is as follows:
Warning: fopen(http://bestafricatravel.com/data/accom_province.csv) [function.fopen]: failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found in /[doc_root]/[system_folder]/plugins/pi.csvgrab.php on line 192
Also, I opened my .csv up with the Terminal in VI and the line endings are all good for Unix consumption.
I’m using CSVgrab to bulk-import names of people in many countries, particularly France.
Every time there’s a ‘é’ or a ‘ç’, the import translates this to gobbledygook, and that’s how it arrives in the database.
I’ve modified my regex.core file (this is actually external to the system, so that these grabs can be called with PHP) to skip ISO encoding. My files are in UTF-8 and the ISO step was throwing an error. Now that it’s processing in UTF-8 automatically, the error isn’t thrown; it just records incorrect info.
Is anyone able to tell me which settings to check, or point me to resources to learn more about this utterly confusing issue?
Thanks much, .ccb.
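For what it’s worth, ‘é’ turning into ‘Ã©’ is the classic symptom of UTF-8 bytes being reinterpreted as ISO-8859-1 somewhere in the chain (file → plugin → MySQL connection). A minimal demonstration of the mangling, assuming mbstring is available:

```php
<?php
// Take a UTF-8 string and re-encode it as if its bytes were Latin-1:
// exactly what happens when one stage of the pipeline guesses the
// wrong source encoding.
$name = "José";                                               // UTF-8 source
$mangled = mb_convert_encoding($name, 'UTF-8', 'ISO-8859-1'); // "JosÃ©"
echo $mangled, "\n";

// Reversing the mistaken conversion recovers the original bytes:
echo mb_convert_encoding($mangled, 'ISO-8859-1', 'UTF-8'), "\n"; // "José"
```

So the setting to chase is usually not the regex file but whichever stage still assumes ISO-8859-1 — including the MySQL connection character set.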
Hi and thanks for this great plugin, it saved my day for a blog transition 😊
I’m having issues with accents and url_title. Characters such as é, è, and à are simply removed from the url_title, but I’d rather have “e” and “a” than nothing.
Any ideas how I could make this work? Maybe there’s a way to specify it manually, the same way it’s done with the title? As my source is an SQL database, I can write a bit of PHP to handle the url_title accents myself before exporting the data.
Thanks again!
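Since you can run PHP over the source data anyway, a small transliteration pass before export is one way to do it. The character map below is a deliberately short sketch — extend it to cover whatever your data actually contains:

```php
<?php
// Replace accented characters with ASCII equivalents before the
// url_title is generated, so "é" becomes "e" instead of vanishing.
function ascii_slug(string $s): string
{
    $map = ['é'=>'e','è'=>'e','ê'=>'e','ë'=>'e','à'=>'a','â'=>'a',
            'ç'=>'c','ù'=>'u','û'=>'u','ô'=>'o','î'=>'i','ï'=>'i'];
    $s = mb_strtolower($s, 'UTF-8');            // multibyte-safe lowercase first
    $s = strtr($s, $map);                       // portable transliteration
    $s = preg_replace('/[^a-z0-9]+/', '-', $s); // everything else becomes "-"
    return trim($s, '-');
}

echo ascii_slug("Éléphant à Paris"); // elephant-a-paris
```

(iconv with `//TRANSLIT` can do this too, but its output depends on the system locale, which is why the explicit map is the more portable choice.)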
EDIT 2: I think I found the issue… I just had to re-read this thread… again. http://ellislab.com/forums/viewthread/41783/P126/#420736
Can anyone help me figure out what issues could cause a CSVGrab test to work on one server but not on another?
I can’t get a very basic test to work on one server, but on a different server it works just fine. I’m using the exact same template code and CSV file on both servers, so I’m pretty sure it’s neither of those. And “allow_url_fopen” is set to ON for both servers, so that’s not it either.
Here are some possible ideas:
• Could it be that I am using Gypsy fields on the install that isn’t working (but not on the fields I’m trying to import for the test)?
• Could it be an extension conflict?
There are 3 test entries in the import file but the trace stops here…
TRACE: Resource id #68
TRACE: 1 - Array ( [0] => Test Entry [1] => Test Entry [2] => This is a test )
EDIT - I now see an error message at the end of the trace info… does that help?
Fatal error: Call to undefined function mb_convert_encoding() in /home/sitename/public_html/system/plugins/pi.csvgrab.php on line 231
EDIT
I finally realized that the problem was that although I assigned Gypsy fields to the target weblog, I had not assigned a Field Group to the weblog I was trying to import into. Once I did that, I successfully imported some test entries.
–
It turns out that when I try an import into a weblog that is using Gypsy fields, the data is imported fine into the exp_weblog_titles table but NOT into the exp_weblog_data table.
I just tried a test import on a different weblog on this same site that is NOT using Gypsy fields and it worked fine.
Has anyone else had a problem importing into Gypsy fields? Is there a workaround?
I too am getting the famous error:
Fatal error: Call to undefined function: mb_convert_encoding() in /home/.babbaboombanorsh/aaa/aaa.com/system/plugins/pi.csvgrab.php on line 231
I’m on DreamHost, and honestly, I don’t expect them to do much, if anything at all. I have asked them to turn on mbstring…
Is there anything else I can do on my own to make this work? I was really looking forward to testing, but have lost several hours reading and trying and nothing has worked.
Is anyone else here using DreamHost? Having success? Failure? Please share your experiences, thanks.
Thanks in advance,
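One self-help option, if DreamHost won’t enable mbstring: patch the plugin to fall back to iconv (compiled in by default on almost every PHP build) when mb_convert_encoding() is missing. The wrapper below is a sketch — the exact call site near line 231 of pi.csvgrab.php may pass different arguments:

```php
<?php
// Wrapper: use mbstring when present, fall back to iconv otherwise.
function safe_convert_encoding(string $str, string $to, string $from): string
{
    if (function_exists('mb_convert_encoding')) {
        return mb_convert_encoding($str, $to, $from);
    }
    return iconv($from, $to, $str); // note iconv's reversed argument order
}

echo safe_convert_encoding("café", 'UTF-8', 'UTF-8'); // café
```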