Upload Error: 401

I'm getting this error when trying to upload several 10-15k files via C5 on my live server. Error log shows this:

[30-Sep-2009 07:57:35] PHP Warning:  join() [<a href='function.join'>function.join</a>]: Invalid arguments passed in /home/username/public_html/concrete/tools/files/importers/uploaded.php on line 15

Does anyone know why I'm getting this error?

It's not to do with the upload limit which is 64M.



View Replies:
osu replied on at Permalink Reply
Am I the only one who's had this problem?

Would be useful to know as I'll be able to troubleshoot better.


jbx replied on at Permalink Reply
Something looks wrong with your file based purely on what I can see in that error message. Could you paste the code from that file please (/concrete/tools/files/importers/uploaded.php). It's only a small file. Just wanna make sure that what you have in there is correct...

osu replied on at Permalink Reply
Thanks for taking a look Jon, much appreciated - could it be a permissions issue?:

<div style="text-align:center">
   <div style="margin:24px 0px; font-weight:bold">
      <?php echo count($_REQUEST['fID'])?> file<?php echo (count($_REQUEST['fID'])!=1)?'s':''?> uploaded successfully.
   </div>
   <div style="margin-bottom:24px">
      <a onClick="ccm_filesApplySetsToUploaded([<?php echo join(',',$_REQUEST['fID'])?>]);">Assign File Sets</a> 
      <a onClick="ccm_filesApplyPropertiesToUploaded([<?php echo join(',',$_REQUEST['fID'])?>]);">Edit Properties</a>
   </div>
   [ <a onClick="jQuery.fn.dialog.closeTop()"><?php echo t('Close Window') ?></a> ]
</div>
jbx replied on at Permalink Reply
The file looks fine - make sure that your entire /files directory is writeable by the webserver.

$ chown -R ftpuser.apache files/
$ chmod -R ug+rwX files/

works for me...
osu replied on at Permalink Reply
Hi jbx,

I have this as the permissions for my /files folder:

755 (rwxr-xr-x)

Group and World can't write to this directory; only the owner can.

Is that how it should be? It's the same chmod as the locally-installed site I'm working on, and the multiple file upload works for that...
jbx replied on at Permalink Reply
Basically, the user that your webserver runs as (apache in my case, but could be different) needs write access to that directory. Very often either root or the ftp user is the owner, and that won't allow write access for the webserver, which would cause a problem with file uploads...

Hope that makes sense...
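jbx's point can be checked directly from a shell. The sketch below uses a scratch directory rather than a real site (the `ftpuser:apache` owner pair is an example; the group name depends on your host's webserver user):

```shell
# A 755 directory is writable only by its owner: if the FTP user owns
# files/ and the webserver runs as a different user, uploads will fail.
dir="$(mktemp -d)/files"
mkdir -p "$dir"
chmod 755 "$dir"
ls -ld "$dir"    # drwxr-xr-x: group and world cannot write

# jbx's fix: add group write (ug+rwX); on a real server you would also
# make the webserver's user the group owner, e.g. chown -R ftpuser:apache
chmod -R ug+rwX "$dir"
ls -ld "$dir"    # drwxrwxr-x: the webserver's group can now write
```

The capital `X` in `ug+rwX` adds execute (traverse) permission only on directories and already-executable files, which is why it is safer than a blanket `0777`.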
osu replied on at Permalink Reply
Considering it's the same owner for all directories in my site, and things like adding files from an 'incoming' directory are working, I doubt it's a permissions problem...

I guess it's something to do with my setup - it couldn't be page type, template or theme related, could it?
10010110 replied on at Permalink Reply
It may not be related to your issue, but I had the same error coming up and learned that error 401 means that the script or whatever isn't “authorized” to access the file/directory. I remembered that I had an htaccess authorization (i.e. a user/password request) and this was what caused the error. Removing the htaccess authorization made it work again.

Just in case anyone runs into this problem, too.
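That diagnosis also fits the original error: if the upload request is refused with a 401, no fID values ever reach the tools script, and join() then warns about invalid arguments. A quick way to check whether basic auth is intercepting requests (the URL and credentials below are placeholders, not real endpoints):

```shell
# Print only the HTTP status code for the dev site; a 401 here means
# htaccess password protection will also block the multi-file uploader,
# which does not send the browser's credentials along with its requests.
curl -s -o /dev/null -w '%{http_code}\n' http://dev.example.com/index.php

# The same request with credentials supplied should come back 200.
curl -s -o /dev/null -w '%{http_code}\n' -u user:pass http://dev.example.com/index.php
```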
nando replied on at Permalink Reply
When trying to upload multiple files I get an error

This is the php-error.log

PHP Warning: join() [<a href='function.join'>function.join</a>]: Invalid arguments passed in C:\inetpub\wwwroot\concrete\tools\files\add_to_complete.php on line 20

I am using IIS7 on a Win2008 server... I was having issues with single uploads getting an invalid file message, which I kind of solved by setting open_basedir = C:/ instead of C:/inetpub/wwwroot
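Rather than opening open_basedir up to all of C:/, it is usually enough to include both the web root and PHP's temp upload directory in the list, since uploaded files land in upload_tmp_dir before the script can touch them. A php.ini sketch, assuming a default Windows layout (both paths are examples):

```ini
; open_basedir is a semicolon-separated list of allowed paths on Windows.
; Uploads are written to upload_tmp_dir first, so it must be listed too.
open_basedir = "C:\inetpub\wwwroot;C:\Windows\Temp"
upload_tmp_dir = "C:\Windows\Temp"
```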
Donatas replied on at Permalink Reply
For me a single file upload worked fine but multiple didn't and it was because of older flash version in my computer.
keeasti replied on at Permalink Reply
I had the same experience ... single worked but multiple didn't.
Turned out that it was the .htaccess password protection I had set up on the development server. Once that was removed, it worked fine with 0755 permissions (although you may need to go 0777 depending on your host)
Hope this helps someone!
cmscss replied on at Permalink Reply
@keeasti, thanks heaps for posting your update - we were struggling to figure this out and also use apache's .htpasswd to secure our dev sites.

For anyone having issues with this, we've been adding an exception for the /files directory in the virtual host file - I think you can do a similar thing in the .htaccess file but can't use the <Directory> directive - see here: http://perishablepress.com/enable-file-or-directory-access-to-your-...

<VirtualHost *:80>
  ServerName website.dev.com
  DocumentRoot /home/website/public_html/
  # Send to dev server's robots file which contains disallow all
  Alias /robots.txt /home/robots.txt
  <Directory /home/website/public_html/ >
    Options -Indexes FollowSymLinks MultiViews
    AllowOverride All
    Order allow,deny
    Allow from all
    # Requires the .htpasswd file created during dev server setup
    # -----------------------------------------------------------------------
    AuthUserFile /home/.htpasswd
    AuthType Basic
    AuthName "Development server"
    Require valid-user
  </Directory>
  # The exception: serve /files without asking for credentials
  <Directory /home/website/public_html/files >
    Satisfy Any
    Allow from all
  </Directory>
</VirtualHost>

Make sure to reload apache after adding it to update the virtual host files.
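For anyone who can't edit the virtual host, roughly the same exception can be sketched as an .htaccess file dropped inside /files itself (since <Directory> isn't allowed in .htaccess, the per-directory file takes its place; Apache 2.2 syntax to match the Order/Allow directives used above):

```apache
# public_html/files/.htaccess - let requests through without credentials.
# Satisfy Any grants access if EITHER host rules OR basic auth pass,
# and Allow from all makes the host rules always pass.
Satisfy Any
Order allow,deny
Allow from all
```

This relies on AllowOverride being enabled for the parent directory, which the virtual host above already sets.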


cmscss replied on at Permalink Reply
Hmmm... weirdly, this doesn't work in Safari or Firefox which still block access to the /files url.

No biggie as we'll just use Webkit for the dev server.