Upload Error: 401
[30-Sep-2009 07:57:35] PHP Warning: join() [<a href='function.join'>function.join</a>]: Invalid arguments passed in /home/username/public_html/concrete/tools/files/importers/uploaded.php on line 15
Does anyone know why I'm getting this error?
It's not to do with the upload limit which is 64M.
It would be useful to know, as I'll be able to troubleshoot better.
<?php ?>
<div style="text-align:center">
	<div style="margin:24px 0px; font-weight:bold">
		<?php echo count($_REQUEST['fID'])?> file<?php echo (count($_REQUEST['fID'])!=1)?'s':''?> uploaded successfully.
	</div>
	<div style="margin-bottom:24px">
		<a onClick="ccm_filesApplySetsToUploaded([<?php echo join(',',$_REQUEST['fID'])?>]);">Assign File Sets</a> |
		<a onClick="ccm_filesApplyPropertiesToUploaded([<?php echo join(',',$_REQUEST['fID'])?>]);">Edit Properties</a>
	</div>
	[ <a onClick="jQuery.fn.dialog.closeTop()"><?php echo t('Close Window') ?></a> ]
</div>
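The warning happens because `join()` expects an array, but `$_REQUEST['fID']` is missing (or not an array) when the upload has already failed, e.g. with a 401. A minimal defensive sketch, using a hypothetical helper rather than the stock uploaded.php code:

```php
<?php
// Hypothetical helper, not the stock uploaded.php code: join() emits
// "Invalid arguments passed" when its second argument is not an array,
// which happens when the upload fails before any file IDs exist.
function uploaded_file_ids($request)
{
    if (!isset($request['fID'])) {
        return array();                 // nothing uploaded: empty list, no warning
    }
    // Normalize a single value into an array, then cast to ints
    // so the IDs are safe to echo into the inline JavaScript.
    $ids = is_array($request['fID']) ? $request['fID'] : array($request['fID']);
    return array_map('intval', $ids);
}

// join() now always receives an array, so the warning goes away:
$fIDs = uploaded_file_ids($_REQUEST);
echo count($fIDs) . ' file' . (count($fIDs) != 1 ? 's' : '') . " uploaded successfully.\n";
echo join(',', $fIDs) . "\n";
```

This only silences the symptom, though — the real problem is whatever made the upload fail in the first place.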
$ chown -R ftpuser.apache files/
$ chmod -R ug+rwX files/
works for me...
I have this as the permissions for my /files folder:
Only Group and World can't write to this directory.
Is that how it should be? It's the same chmod as on a locally-installed site I'm working on, and the multiple file upload works for that...
Hope that makes sense...
I guess it's something to do with my setup - it couldn't be page type template or theme related, could it?
Just in case anyone runs into this problem, too.
This is from the php-error.log:
PHP Warning: join() [<a href='function.join'>function.join</a>]: Invalid arguments passed in C:\inetpub\wwwroot\concrete\tools\files\add_to_complete.php on line 20
I am using IIS7 on Windows Server 2008... I was having issues with single uploads getting an invalid file message, which I sort of solved by setting open_basedir = C:/ instead of C:/inetpub/wwwroot
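Opening open_basedir up to C:/ works but removes the restriction entirely. A narrower php.ini sketch — assuming the temp upload directory is C:\Windows\Temp (check your own upload_tmp_dir); open_basedir must cover both the web root and wherever PHP stages uploaded files, or move_uploaded_file() fails:

```ini
; Sketch only - paths are assumptions, adjust to your server.
; Multiple open_basedir paths are separated with ";" on Windows.
open_basedir = "C:\inetpub\wwwroot;C:\Windows\Temp"
upload_tmp_dir = "C:\Windows\Temp"
```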
Turned out that it was the .htaccess password protection I had set up on the development server. Once that was removed, it worked fine with 0755 permissions (although you may need to go 0777 depending on your host).
Hope this helps someone!
For anyone having issues with this, we've been adding an exception for the /files directory in the virtual host file - I think you can do a similar thing in the .htaccess file, but you can't use the <Directory> directive there - see here: http://perishablepress.com/enable-file-or-directory-access-to-your-...
<VirtualHost *:80>
	ServerName website.dev.com
	DocumentRoot /home/website/public_html/
	# Send to dev server's robots file which contains disallow all
	Alias /robots.txt /home/robots.txt
	<Directory /home/website/public_html/>
		Options -Indexes FollowSymLinks MultiViews
		AllowOverride All
		Order allow,deny
		Allow from all
		# PASSWORD PROTECT WEBSITE ON DEV SERVER
		# Requires the .htpasswd file created during dev server setup
		# -----------------------------------------------------------------------
		AuthUserFile /home/.htpasswd
		AuthType Basic
Make sure to reload apache after adding it to update the virtual host files.
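The /files exception itself could look something like the sketch below (Apache 2.2 syntax, matching the Order/Allow directives above — the directory path is this example's, so substitute your own). "Satisfy Any" lets the Allow rule bypass the Basic-auth requirement, so requests to /files no longer get a 401 behind the password wall:

```apache
# Sketch: exempt the Concrete5 /files directory from Basic auth.
<Directory /home/website/public_html/files>
	Order allow,deny
	Allow from all
	Satisfy Any
</Directory>
```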
No biggie as we'll just use Webkit for the dev server.