Bench update error: pymysql.err.DataError: (1406, u"Data too long for column 'content' at row 1")


I am updating from
ERPNext: v12.1.8 (version-12)
Frappe Framework: v12.0.18 (version-12)

and I get pymysql.err.DataError: (1406, u"Data too long for column 'content' at row 1")

I have searched, and it seems there may be some Spanish characters that MySQL doesn't like when I run `bench update`, `bench update --reset`, or `bench update --patch`.

  • Does anyone know which DocType has a column named 'content', so that before I update I can remove the special characters and MySQL won't complain?

```
frappe@ERPNext:~/frappe-bench$ bench update --patch
Backing up sites...
Patching sites...
Updating DocTypes for frappe : [========================================]
Updating DocTypes for erpnext : [========================================]
Updating customizations for Address
Traceback (most recent call last):
  File "/usr/lib/python2.7/", line 174, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/", line 72, in _run_code
    exec code in run_globals
  File "/home/frappe/frappe-bench/apps/frappe/frappe/utils/", line 97, in <module>
  File "/home/frappe/frappe-bench/apps/frappe/frappe/utils/", line 18, in main
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/click/", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/click/", line 717, in main
    rv = self.invoke(ctx)
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/click/", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/click/", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/click/", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/click/", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/click/", line 17, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/home/frappe/frappe-bench/apps/frappe/frappe/commands/", line 25, in _func
    ret = f(frappe._dict(ctx.obj), *args, **kwargs)
  File "/home/frappe/frappe-bench/apps/frappe/frappe/commands/", line 233, in migrate
    migrate(context.verbose, rebuild_website=rebuild_website, skip_failing=skip_failing)
  File "/home/frappe/frappe-bench/apps/frappe/frappe/", line 62, in migrate
  File "/home/frappe/frappe-bench/apps/frappe/frappe/utils/", line 276, in update_global_search_for_all_web_pages
  File "/home/frappe/frappe-bench/apps/frappe/frappe/utils/", line 359, in sync_global_search
  File "/home/frappe/frappe-bench/apps/frappe/frappe/utils/", line 394, in sync_value
    }, value)
  File "/home/frappe/frappe-bench/apps/frappe/frappe/database/", line 932, in multisql
    return self.sql(query, values, **kwargs)
  File "/home/frappe/frappe-bench/apps/frappe/frappe/database/", line 156, in sql
    self._cursor.execute(query, values)
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/pymysql/", line 170, in execute
    result = self._query(query)
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/pymysql/", line 328, in _query
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/pymysql/", line 516, in query
    self._affected_rows = self._read_query_result(unbuffered=unbuffered)
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/pymysql/", line 727, in _read_query_result
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/pymysql/", line 1066, in read
    first_packet = self.connection._read_packet()
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/pymysql/", line 683, in _read_packet
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/pymysql/", line 220, in check_error
  File "/home/frappe/frappe-bench/env/local/lib/python2.7/site-packages/pymysql/", line 109, in raise_mysql_exception
    raise errorclass(errno, errval)
pymysql.err.DataError: (1406, u"Data too long for column 'content' at row 1")
```


  • Anyone?

To find out which tables have that column, you can query INFORMATION_SCHEMA:

```sql
SELECT      COLUMN_NAME AS 'ColumnName'
            ,TABLE_NAME AS 'TableName'
FROM        INFORMATION_SCHEMA.COLUMNS
WHERE       COLUMN_NAME = 'content'
ORDER BY    TableName;
```

Run it exactly like that; it will return about 19 rows.

Thanks, mel_erp! I'll try it and let you know.

any updates?

Running that query, I got this list:

| ColumnName | TableName |
|---|---|
| content | help |
| content | tabActivity Log |
| content | tabArticle |
| content | tabBlog Post |
| content | tabChat Message |
| content | tabComment |
| content | tabCommunication |
| content | tabContent Activity |
| content | tabCourse Activity |
| content | tabCourse Content |
| content | tabHelp Article |
| content | tabHomepage Section Card |
| content | tabLetter Head |
| content | tabNote |
| content | tabPost |
| content | tabPost Comment |
| content | tabTopic Content |
| content | __global_search |

18 rows in set (0.06 sec)
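To narrow down which of these tables actually holds the oversized row, one approach is to check `LENGTH(content)` per table, since MySQL's `LENGTH()` counts bytes, which is what the TEXT limit is measured in. A minimal sketch that just builds the probe queries; `long_content_probe` is a hypothetical helper (not part of Frappe), and it assumes each table has Frappe's standard `name` column:

```python
def long_content_probe(tables, byte_limit=65000):
    """Build one SELECT per table that lists rows whose `content`
    column exceeds the given byte limit (LENGTH() counts bytes)."""
    template = ("SELECT name, LENGTH(content) AS content_bytes "
                "FROM `{0}` WHERE LENGTH(content) > {1}")
    return [template.format(table, byte_limit) for table in tables]

# Print probe queries for a couple of the tables from the list above:
for query in long_content_probe(["tabComment", "tabCommunication"]):
    print(query)
```

Any query that returns rows points at the table (and document) causing the 1406 error.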

I will update this thread if I manage to find the error, because I won't be able to upgrade until this gets solved.

Same problem here. I managed to work around it.

The problem is that the `sync_value` function in frappe/frappe/utils/ tries to insert a very large value into the `content` column of the `__global_search` table. The `content` column has a TEXT data type, which supports up to 65,535 bytes. For some reason the data to be inserted is larger than that.
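One detail worth noting: the TEXT limit is 65,535 *bytes*, not characters, so multi-byte UTF-8 characters (such as Spanish accented letters) count for more than one byte each. That is why a document that looks shorter than the limit can still overflow the column. A minimal sketch illustrating this; `exceeds_text_limit` is a hypothetical helper, not part of Frappe:

```python
def exceeds_text_limit(content, byte_limit=65535):
    """Return True if `content` would overflow a MySQL TEXT column.

    The TEXT limit is 65,535 bytes, not characters, so multi-byte
    UTF-8 characters count for more than one byte each.
    """
    return len(content.encode("utf-8")) > byte_limit

# 65,000 ASCII characters fit, but 65,000 two-byte characters do not:
print(exceeds_text_limit("a" * 65000))        # False
print(exceeds_text_limit(u"\u00f1" * 65000))  # True (130,000 bytes)
```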

To work around this, you can edit the `sync_value` function. Here is what I did:

```python
def sync_value(value):
	'''
	Sync a given document to global search
	:param value: dict of { doctype, name, content, published, title, route }
	'''
	# Skip documents whose content would overflow the TEXT column
	if len(value["content"]) > 65000:
		return
	# ... rest of the function unchanged ...
```

That is, if the content is larger than 65,000 characters, skip the insert for that value.
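An alternative to skipping oversized documents entirely is to truncate the content so at least part of it stays searchable. A sketch of a hypothetical helper (not part of Frappe) that cuts on a safe UTF-8 byte boundary:

```python
def truncate_for_text_column(content, byte_limit=65000):
    """Trim `content` so its UTF-8 encoding fits within `byte_limit`
    bytes, without splitting a multi-byte character at the cut."""
    encoded = content.encode("utf-8")
    if len(encoded) <= byte_limit:
        return content
    # errors="ignore" silently drops any partial character at the cut point.
    return encoded[:byte_limit].decode("utf-8", errors="ignore")
```

Calling `truncate_for_text_column(value["content"])` before the insert would keep the document in global search instead of dropping it from the index.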


Hi Roque.

I have been waiting for this for over a year now. You have no idea how much you helped me.

Thank you!

Nice to know man! :+1:

Is there a PR on GitHub to fix this?

Finally got `bench update` working. Instead of adding the code provided by @roquegv (which gave me some trouble because of some of my customizations), I changed the `__global_search` table's `content` column from data type TEXT to MEDIUMTEXT, which accepts up to about 16 MB of data, and changed it back to TEXT after the update.
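For reference, the widen-then-restore approach boils down to two ALTER statements, which can be run in the MariaDB console (e.g. via `bench mariadb`). A small sketch that just builds the statement pair; `widen_and_restore_sql` is a hypothetical helper, and note that shrinking back to TEXT will truncate any row still exceeding 65,535 bytes:

```python
def widen_and_restore_sql(table="__global_search", column="content"):
    """Return (widen, restore) ALTER statements for temporarily
    switching a TEXT column to MEDIUMTEXT during a migration."""
    widen = "ALTER TABLE `{0}` MODIFY `{1}` MEDIUMTEXT".format(table, column)
    restore = "ALTER TABLE `{0}` MODIFY `{1}` TEXT".format(table, column)
    return widen, restore

widen, restore = widen_and_restore_sql()
print(widen)    # ALTER TABLE `__global_search` MODIFY `content` MEDIUMTEXT
print(restore)  # ALTER TABLE `__global_search` MODIFY `content` TEXT
```

Run the first statement before `bench update` and the second after it completes.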

I hope this helps someone else