
CoRI presents requested proposals on Online and Broadcast Content regulations


The government recently requested, on short notice, that media stakeholders provide their views and input on amendments to the draft Electronic and Postal Communications (Online Content and Broadcast Content) Regulations.

Here are the proposals CoRI presented to the Government for consideration.

Submission of Collected Views on the Electronic and Postal Communications (Online Content) Regulations, 2017

Coalition on the Right to Information

Each entry below gives the regulation concerned, its current wording, CoRI's proposed change, and the reason for the proposal.

PART I – PRELIMINARY PROVISIONS

Regulation 3 – definition of “indecent material”

Current: “indecent material” means material which is offensive, morally improper and against current standards of current behaviour which includes nudity and sex

Proposed: “indecent material” means material which contains explicit and gratuitous nudity and sex

Reason: This definition seems intended to target issues around sex, but the original wording is too broad and vague, leaving it open to misinterpretation.

 

Regulation 3 – definition of “online”

Current: “online” means a networked environment available via online whereby content is accessible to or by the public whether for a fee or otherwise and which is intended for consumption in or originated from Tanzania

Proposed: “online” means by means of the internet or other computer network

Reason: These are globally recognised terms and as such have fixed definitions. If there is a desire to specify geography, this can be added or implied from the context of the law.

 

Regulation 3 – new definition (proposed addition)

Current: –

Proposed: Add definition: “content committee” is the committee responsible for electronic (online) content under the auspices of the Tanzania Communications Regulatory Authority

Reason: Currently the regulations provide no process for determining whether content falls under the categories of prohibited content listed in Regulation 12. Anyone who assesses content as prohibited can issue a take-down notice, which all parties are then required to comply with. It is essential that a qualified and competent body determine whether content is prohibited and be vested with the authority to issue take-down notices. Otherwise anyone with a grudge, an improper interpretation of these Regulations or malicious intent can issue take-down notices on any content.

 

PART III – GENERAL OBLIGATIONS FOR ONLINE CONTENT

Regulation 5(1)(f)

Current: Have in place mechanisms to identify source of content

Proposed: Remove

Reason: Online content providers would include any corporate or institutional website, any user of social media, and blogs, forums and online media. This is an extremely wide and diverse range of content providers, and as such it will not always be feasible to identify the sources of all content.

Regulation 5(1)(g)

Current: Take corrective measures for objectionable or prohibited content; and

Proposed: Take corrective measures for prohibited content; and

Reason: “Objectionable content” is too wide and undefined: who has to object for content to be classified as objectionable? The term is also not defined in the interpretation section of the Regulations, while prohibited content is.

Regulation 5(1)(h)

Current: Ensure prohibited content is removed within 12 hours upon being notified

Proposed: Ensure prohibited content is removed within 48 hours upon being notified by the Content Committee of the Authority

Reason: 12 hours is an extremely short time in which to comply with an order. It gives no opportunity to dispute the notice and may cause people to lose their access to the internet if they happen not to see the take-down notice in time. In addition, sub-regulation (h) does not specify who is responsible for issuing these take-down notifications.

Regulation 5(3)

Current: An online content provider shall co-operate with law enforcement officers in ensuring functions under these regulations.

Proposed: An online content provider shall co-operate with law enforcement officers in ensuring functions under these regulations. Where an online content provider takes issue with the requirements made by law enforcement officers, the dispute must be resolved in a court of law.

Reason: If an online content provider sees malicious intent or over-stepping in the requests of law enforcement officers, it is essential that they have recourse to object to what is being asked of them. Some law enforcement requests may contravene the right to privacy or even the protection of whistleblowers, so a competent authority must be allowed to make a final judgement.

Regulation 6(1)(b)

Current: Remove prohibited content provided such removal is carried out in accordance with these Regulations

Proposed: Remove

Reason: The responsibility for complying with take-down notices must rest with one party – the user who posted the content. Otherwise responsibilities become diffused. Application service licensees should not have the authority to interfere with individuals’ content; this grants them authority that should not be available to private companies.

Regulation 6(3)

Current: Once the licensee is notified by the Authority or by the person affected by the content of existence of prohibited content, it shall, within 12 hours, from the time of notification, inform its subscriber to remove the prohibited content.

Proposed: Once the licensee is notified by the Content Committee of the existence of prohibited content, it shall, within 48 hours from the time of notification, inform its subscriber to remove the prohibited content. Evidence of the take-down notice from the Content Committee should be provided to the subscriber.

Reason: 12 hours is an extremely short time in which to comply with an order. It gives no opportunity to dispute the notice and may cause subscribers to lose their access to the internet if they happen not to see the take-down notice in time. In addition, as currently formulated, take-down notices can be issued by anyone who objects to a person’s content, whereas this authority must be vested in a competent and qualified body.

Regulation 6(4)

Current: Upon receipt of notification pursuant to Sub regulation 3, the subscriber shall, within 12 hours from the time of notification, remove the prohibited content.

Proposed: Upon receipt of notification pursuant to Sub regulation 3, the subscriber shall, within 48 hours from the time of notification, remove the prohibited content.

Reason: 12 hours is an extremely short time in which to comply with an order. It gives no opportunity to dispute the notice and may cause subscribers to lose their access to the internet if they happen not to see the take-down notice in time.

Regulation 6(5)

Current: Where the subscriber fails to remove the prohibited content within 12 hours, the licensee shall suspend or terminate the subscriber’s access account.

Proposed: Where the subscriber fails to remove the prohibited content within 48 hours, the licensee shall suspend or terminate the subscriber’s access account, pending determination of any matter before a court of law.

Reason: If the user objects to the take-down notice and files proceedings in this regard, their internet access should not be suspended until any disputes before the court are resolved.

Regulation 7(1)

Current: Subject to Regulation 5 every blogger and online forum shall-

Proposed: Subject to Regulation 5 every blogger and online forum that is run for commercial purposes shall-

Reason: The power and potential of the internet rests in freeing the means of communication and making them accessible to everyone. Registration is likely to involve cost, whether in time or in money for a registration fee. This burden should not be imposed on blogs and forums unless they are commercially run.

Regulation 7(1)(b)

Current: Ensure that, where his blog or forum allows the general public to post content, he sets mechanism that content is not published prior to the blogger’s review

Proposed: Ensure that, where his blog or forum allows the general public to post content, he removes content after receiving notice to do so from the Content Committee

Reason: Pre-moderation of all comments is almost impossible: for platforms that receive hundreds of comments a day, it is an impossible investment of time and human resources. A number of global platforms used in Tanzania, such as Facebook and Instagram, do not allow this function, so under the current formulation there could be no users of Facebook or Instagram in Tanzania. In addition, individual bloggers and platform managers are not competent or qualified to assess whether content is prohibited; this must be done by a competent and qualified authority.

Regulation 7(1)(c)

Current: Use moderating tools to filter content and set mechanism to identify the sources of such content

Proposed: Remove

Reason: It is critical that content is moderated or taken down by a competent and qualified authority. Allowing the private individuals or companies who run blogs and forums to remove content gives them too much power; they should only be allowed to enact the orders of a body that can appropriately make decisions on these matters. In particular, the list of prohibited content is quite broadly defined in places and so open to differing interpretations. In addition, anonymity is protected by the Whistleblowers and Witness Protection Act, Article 4(3), and for blog and forum owners to acquire additional information about those who post on these platforms would also invade their right to privacy.

Regulation 7(2)

Current: Sub Regulation 1 shall apply to Tanzania residents, Tanzania citizens outside the country, non-citizens of Tanzania residing in the country, blogging or running online forums with contents for consumption by Tanzanians.

Proposed: Sub Regulation 1 shall apply to Tanzania residents, Tanzania citizens running commercial blogs and forums outside the country, non-citizens of Tanzania residing in the country, blogging or running online forums with contents for consumption by Tanzanians.

Reason: It will be almost impossible to enforce the registration requirement for those outside the country. In addition, registration should only be required for commercially run blogs and forums.

Regulation 8(b)

Current: Upon notification by the person affected by the content, the Authority, or law enforcement agency, remove the hosted content

Proposed: Remove

Reason: For increased accountability and a lower enforcement burden, only the user (individual, group or legal entity) who owns and posted the content should be able to take that content down. Online content hosts should not have the authority to interfere with individuals’ content; this grants them authority that should not be available to private companies.

Regulation 9(c)

Current: Put in place mechanism to filter access to prohibited content

Proposed: If a user is seen in an internet café accessing prohibited content, he/she should be asked to leave the premises

Reason: Requiring internet cafés to put in place filtering that prevents access to prohibited content is onerous and impractical. The list of prohibited content in Regulation 12 is wide and open to differing interpretations, and private companies should not have the authority to determine whether content is prohibited. It also creates an unnecessary burden on these often small businesses: many people now access the internet through phones or private computers, so if specific websites are blocked (filtered) by internet cafés, this will lose them business while determined users find other means to access the prohibited content.

Regulation 9(d)

Current: Install surveillance camera to record and archive activities inside the cafe

Proposed: Put in place a registration form such that all customers are required to provide their names and contact details.

Reason: Most internet cafés in the country are small businesses consisting of a few computers. The cost burden of installing surveillance cameras and archiving the footage for an unspecified period of time will force many of these businesses to close, depriving people of a source of income and jobs. These internet cafés provide low-cost access to the internet for millions of Tanzanians who cannot afford computers and smartphones, so placing these types of restrictions on them and forcing the closure of many will deprive many users of the world’s largest knowledge bank. Putting in place a low-cost solution will serve the same purpose with much less of a burden on these small businesses, which are often run by young people.

Regulation 10(b)

Current: Use password to protect any user equipment or access equipment or hardware to prevent unauthorized access or use by unintended persons.

Proposed: Remove

Reason: In some cases, passwords can be detrimental. In an emergency, a password on a phone could prevent someone from contacting the relatives or friends of a person who has been in an accident or been hurt. The use, or not, of passwords should be left to individuals to determine depending on their own circumstances.

Regulation 11(2)

Current: Notwithstanding Sub Regulation 1 or other provisions of these Regulations, any authorized person who executes a directive or assists with execution of such directive and obtains knowledge of any information shall not-

Proposed: Notwithstanding Sub Regulation 1 or other provisions of these Regulations, any authorized person who executes a directive or assists with execution of such directive and obtains knowledge of any information shall-

Reason: Including “not” here is an error because it applies only to 11(2)(a), not to 11(2)(b).

Regulation 11(2)(a)

Current: Disclose such information to another person unless that other person is a law enforcement officer and to the extent that such disclosure is necessary for the proper performance of the official duties of the authorized person or the law enforcement officer receiving the disclosure; or

Proposed: Not disclose such information to another person unless that other person is a law enforcement officer and to the extent that such disclosure is necessary for the proper performance of the official duties of the authorized person or the law enforcement officer receiving the disclosure; or

Reason: This is part of the same error mentioned above.

Regulation 12(1)(a)

Current: Indecent content save for sex and nudity sex scenes approved by the body responsible for film censorship;

Proposed: Indecent material save for sex and nudity sex scenes approved by the body responsible for film censorship;

Reason: The definition of indecent content provided in Regulation 12 exactly matches the definition of indecent material in the Preliminary Provisions, so there is no need to introduce a new term here.

Regulation 12(1)(b)

Current: Obscene content

Proposed: Remove

Reason: The definition of obscene content contains so many subjective terms that it leaves considerable room for misinterpretation and differing interpretations. For users to be able to comply with these Regulations and know what they can and cannot post online, these clauses need to be clear and specific. The spirit of the definition of obscene content is covered by sub-regulations (a), (e) and (f).

Regulation 12(1)(d)

Current: Explicit sex acts or pornography

Proposed: Remove

Reason: Covered in the slightly amended definition of (e) below.

Regulation 12(1)(e)

Current: Sex crimes, rape or attempted rape and statutory rape, or bestiality

Proposed: Sex crimes, rape or attempted rape and statutory rape, bestiality or pornography

Reason: 12(1)(d) and 12(1)(e) cover similar issues and so are best combined. These issues are also covered in 12(1)(a), so there is no need for constant repetition.

Regulation 12(1)(f)

Current: Content that portrays violence, whether physical, verbal or psychological, that can upset, alarm and offend viewers and cause undue fear among the audience or encourage imitation

Proposed: Content that intentionally promotes violence, whether physical, verbal or psychological, that can upset, alarm and offend viewers and cause undue fear among the audience or encourage imitation

Reason: In some cases violence is part of real life and there can be valid reasons for portraying it; for example, a campaign designed to reduce domestic violence may contain graphic images. In laws regulating content, intention is important to consider.

Regulation 12(1)(g)

Current: Content that portrays sadistic practices and torture, explicit and excessive imageries of injury and aggression, and of blood or scenes of executions or of people clearly being killed

Proposed: Content that intentionally promotes sadistic practices and torture, explicit and excessive imageries of injury and aggression, and of blood or scenes of executions or of people clearly being killed

Reason: Without changing “portrays” to “promotes”, parts of this sub-regulation could be applied to news content.

Regulation 12(1)(h)

Current: Content that causes annoyance, threatens harm or evil, encourages or incites crime, or leads to public disorder

Proposed: Content that encourages or incites crime

Reason: Annoying someone is subjective and should not be punishable by a prison term. Threatening harm or evil is a crime, as is causing public disorder, so the sub-regulation is much clearer to interpret this way.

Regulation 12(1)(j)(v)

Current: Any other content related to the above

Proposed: Remove

Reason: It is important for users of online content to understand exactly what is and is not prohibited content. Specificity is critical in legal documents.

Regulation 12(1)(k)

Current: Content that uses bad language including but not limited to:-
i) the use of disparaging or abusive words which is calculated to offend an individual or group of persons
ii) crude references words, in any language commonly used in the United Republic, which are considered obscene or profane including crude references to sexual intercourse and sexual organs
iii) hate speech

Proposed: Remove

Reason: Offending someone should not be a criminal offence; this clause essentially criminalises opinion. In addition, the vague terms incorporated into this sub-regulation make it almost impossible to interpret for both citizens and law enforcement authorities, which can result in substantial wastage of state resources in policing unclear content issues online. Moreover, hate speech is already covered in 12(1)(c).

Regulation 12(1)(l)

Current: False content which is likely to mislead or deceive the public except where it is clearly pre-stated that the content is:-

Proposed: Deliberately false malicious content which is intended to deceive the public except where it is clearly stated that the content is:-

Reason: “False content” is a vague and subjective term: a number of cases brought under Article 16 of the Cybercrimes Act (2015) have been thrown out due to the difficulties of prosecuting with this type of terminology. Requiring a showing of bad intention will help to close this gap. In addition, it will not always be practical for satire, parody or fiction to pre-state what it is; as long as this is stated somewhere there should be no issue.

Regulation 12(1)(l)(ii)

Current: Fiction; and

Proposed: Fiction; or

Reason: These conditions cannot all exist at the same time.

Regulation 12(2)

Current: “indecent content” means content which is offensive, morally improper and against current standards of accepted behaviour, including nudity and sex; “obscene content” means content which gives rise to a feeling of disgust by reason of its lewd portrayal and is essentially offensive to one’s prevailing notion of decency and modesty, with a possibility of having a negative influence and corrupting the mind of those easily influenced

Proposed: Remove the definitions of “indecent content” and “obscene content”

Reason: Covered in the Preliminary Provisions or unnecessary.

Regulation 13(a)

Current: Online content provider shall ensure that-
(a) children do not register, access or contribute to prohibited content

Proposed: Online content provider shall aim to ensure that-
(a) children do not register, access or contribute to prohibited content

Reason: These entities can put in place protective measures to try to ensure children do not access prohibited content, such as pop-ups asking users to confirm their age. However, they cannot guarantee that child users will not lie, and as such should not be criminally liable for this. Children’s online safety is much more the responsibility of their parents and/or guardians than of online content providers.

Regulation 13(b)

Current: Users are provided with content filtering mechanisms and parental control

Proposed: Users are provided with parental control

Reason: We should ensure children are protected but not specify exactly how, as this may place an onerous burden on content providers. Filtering mechanisms are generally hard to make available to individual users in a tailored way.

 

PART IV – COMPLAINTS HANDLING

Regulation 14(1)

Current: Any person may file a complaint to the online content provider against parties referred in Regulation 2 in relation to any matter connected with prohibited content

Proposed: Any person may file a complaint to the Content Committee against parties referred in Regulation 2 in relation to any matter connected with prohibited content

Reason: As explained above, it is essential that a competent and qualified authority be tasked with making determinations about prohibited content, rather than just any individual. Empowering any individual or legal entity to issue take-down notices is open to grave misuse.

Regulation 14(2)

Current: Online content provider shall, within 12 hours, resolve the complaint filed under this Regulation.

Proposed: The Content Committee shall, within 5 working days, meet to resolve the complaint filed under this Regulation

Reason: Again, this creates a coherent process for any accusations of prohibited content. Since the Content Committee consists of a group of individuals, they will need more than 12 hours to meet and deliberate on the issue.

Regulation 14(3)

Current: Where the online content provider fails to resolve complaint under this regulation, the aggrieved person may, within thirty days refer the complaint to the Authority.

Proposed: Where the Content Committee fails to resolve the complaint under this Regulation, or where either party is unsatisfied with the resolution, they may, within 10 working days, refer the complaint to the Minister

Reason: As with all legal issues subject to interpretation and dispute, a proper process must be created and followed.

Regulation 14(4) (proposed addition)

Current: –

Proposed: Add clause: Where the Minister fails to resolve the complaint under this Regulation, or where either party is unsatisfied with the resolution, they may, within 10 working days, refer the complaint to a court of law

Reason: The final arbiter of these disputes should always be a court of law, in order to follow due process and guarantee the rights of both parties.

Regulation 15(1)

Current: Upon receiving the complaint under this regulation, the Authority shall serve the online content provider with copy of the complaint and require the online content provider to reply within 12 hours.

Proposed: Upon receiving a complaint under this regulation that is not related to prohibited content, the Authority will make a decision on the same.

Reason: It is important to have a complaints process both for prohibited content, addressed in Regulation 14, and for issues not related to prohibited content, addressed here.

Regulation 15(2)

Current: Where a person is not satisfied with the response of the content provider in Sub regulation 1, the Authority may consider and deal with the complaint through Content Committee procedures.

Proposed: Where a person or legal entity is not satisfied with the response of the Authority, they may, within 30 days, refer the issue to a court of law

Reason: It is important to safeguard due process and allow for expressions of dissatisfaction with the decisions of the Authority.

 

PART V – MISCELLANEOUS PROVISIONS

Regulation 16

Current: Any person who contravenes the provisions of these Regulations, commits an offence and shall, upon conviction be liable to a fine of not less than five million Tanzania shillings or to imprisonment for a term of not less than twelve (12) months

Proposed: Any person who contravenes the provisions of these Regulations, commits an offence and shall, upon conviction be liable to a fine of not more than five million Tanzania shillings or to imprisonment for a term of not more than twelve (12) months

Reason: While the Regulations are a laudable effort to regulate the content available online and to ensure that all businesses involved in the provision of online services comply with minimum standards, the offences contained are not criminal in nature, nor do they warrant such harsh punishments. There should also be a difference in how individual users are treated compared with commercial entities. Setting a maximum fine and sentence gives courts appropriate discretion while not creating incentives for over-censorship.


CoRI’s input on the Electronic and Postal Communications (Broadcast Content) Regulations, 2017

Introduction

The Ministry of Information, Culture, Arts and Sports recently published the draft Broadcast Content Regulations (2017) under the Electronic and Postal Communications Act. This briefing note presents an analysis of the regulations, including but not limited to their implications for freedom of expression. Suggested amendments that would mitigate the concerns raised are put forward.

Analysis

Vague and broad terms and freedom of expression

There are numerous articles in the proposed regulations that limit freedom of expression in some way. Much of this is reasonable – for example to protect public health, to ensure that sponsored content is clearly labelled as such, or to ensure balance in news reporting and fair coverage of competing candidates during election campaigns. However, in several cases, the wording of these terms is particularly broad and open to misinterpretation, and therefore vulnerable to abuse as a means of closing down space for legitimate expression of views. For example:

 

  • 5(h): a requirement for commercial broadcasters to “provide programmes that promote national peace, unity and tranquillity and that does not endanger national security”.
  • 6(2)(b): a requirement that religious content service providers must not “offer programmes that feature the views or beliefs of any race or religion which are unacceptable to the target audience”.
  • 10(1)(a): a requirement on every broadcaster to ensure that their content “upholds national sovereignty, national unity, national interest, national security and Tanzania’s economic interests”, among other things.
  • 10(1)(g): a prohibition on broadcasting anything that has the potential to influence the minds of viewers / listeners without their being aware or fully aware of what has occurred.
  • 10(1)(h): “Avoid programme related to astrology, superstition or broadcast material related to traditional healer purporting to cure ailments or diseases”
  • 10(2)(a): a prohibition on broadcasting “any matter which contains the use of offensive language, including profanity and blasphemy”
  • 10(2)(e): a prohibition on content that is “indecent, obscene, false, menacing or otherwise offensive in character”
  • 16(5): “Every licensee shall refrain from broadcasting programmes that are likely to promote civil or public disorder”

 

The restrictions on content that might endanger national security are reasonable and clear, and of course it is entirely justifiable to try to protect religious minorities and to prevent civil disorder. The difficulty, however, is that many of the terms used above are undefined, vague and/or open to abuse. Protection of “unity and tranquillity”, for example, is language that could be used as a justification for closing down space for legitimate criticism and debate. A similar argument applies to “upholding national unity” and “economic interests”. And in 16(5), unless “likely to” is replaced by “intended to”, legitimate coverage of controversial news events could be prevented.

 

 

Specific concerns and suggested changes:

Concern: 5(h) – “unity and tranquillity” are vague terms.

Suggested change: Remove these terms, so that it reads: “provide programmes that do not endanger national security”.

Concern: 6(2)(b) – restrictions on religious broadcasts that refer to other religions are open to misinterpretation.

Suggested change: Assuming the intention is to protect religious groups against abuse, clarify the language accordingly, such as: “must not offer programmes that denigrate the views or beliefs of any other race or religion.”

Concern: 10(1)(a) – the line “upholds national sovereignty, national unity, national interest, national security and Tanzania’s economic interests” is broad and open to misuse.

Suggested change: Remove the terms “national unity” and “Tanzania’s economic interests”.

Concern: 10(1)(g) – this prohibition arguably prohibits broadcasting almost anything. Influencing the minds of viewers / listeners is complex, and neither viewers nor even content producers are ever fully aware of how this works.

Suggested change: Remove 10(1)(g) or replace the words “has the potential to …” with “is deliberately and maliciously intended to …”.

Concern: 10(1)(h) – prohibits any content “related to astrology, superstition” or “traditional healer[s] purporting to cure ailments or diseases”, even content that questions such matters. As such, no coverage of attacks on people with albinism would be permitted.

Suggested change: Replace 10(1)(h) with the following text: “Avoid programmes that represent astrology, superstition or traditional healers purporting to cure diseases as being effective.”

Concern: 10(2)(a) – the term “offensive language” is very broad, unclear and subjective, open to misinterpretation and abuse; almost any criticism could be considered offensive to someone.

Suggested change: Remove 10(2)(a) and ensure that protections relating to bad language and religious minorities are covered elsewhere.

Concern: 10(2)(e) – appears to make it an offence to broadcast any fictional content, as well as being open to interpretation as to what is considered “menacing” or “offensive”.

Suggested change: Remove 10(2)(e) and ensure that protections relating to sexual content are covered elsewhere.

Concern: 16(5) – the phrase “are likely to” is highly subjective and open to interpretation and abuse.

Suggested change: Replace “are likely to” with “are intended to”.

 

 

Duplication and inconsistency

Large sections of the draft regulations duplicate each other to a large extent, sometimes covering the same issues three or more times. For example, content of a violent nature is covered at length in article 12, and also in 10(2)(c) and 37(7)(a); sexual content is covered in articles 26 and 27, and also in 10(2)(b), 31(1)(b) and 37(7)(c); and offensive language in 10(2)(a), 28(d), 31(1)(b) and 37(7)(d). There is also duplication of terms related to the protection of children from unsuitable content in articles 11 and 13, 5(f) and parts of 12, and to the use of English, Kiswahili and other languages in article 28(a) as well as 5(d), 6(1)(f), 7(1)(n-o) and 30(2)(a).

 

Such duplication is entirely unnecessary and likely to result in confusion.

 

Relatedly, there are several examples of inconsistency in the regulations, particularly with regard to the treatment of the public service broadcaster, commercial broadcasters, non-commercial broadcasters and community broadcasters. For example, in article 5, commercial broadcasters are required to “include drama, documentaries and children’s programmes that reflect the themes and cultural identity of the nation”, “avoid racial and religious hatred”, “avoid programs related to nakedness, gambling, violence, superstition and astrology”, “avoid defamation and blasphemy” and “provide programmes that promote national peace, unity and tranquillity”, while none of these requirements apply to public broadcasters, non-commercial broadcasters or community broadcasters. Similarly, each of these other types of broadcaster is subject to numerous requirements that do not apply to the others.

 

Such inconsistency lacks any clear logic. If it makes sense to put such restrictions on one type of broadcaster, then (perhaps with a few exceptions), it also makes sense to apply them to all other types of broadcasters.

 

Further, some provisions of the proposed regulations directly contradict each other. For example, regulation 14(3)(i) requires that broadcasters “shall not broadcast any programme sponsored by a political party” and 18(1)(d) says that broadcasters must “not permit any broadcast sponsored by or made on behalf of a political party” during election campaigns. Yet 18(1)(c) explains how content sponsored by political parties should be introduced.

 

 

Specific concerns and suggested changes:

Concern: Extensive duplication, particularly with regard to violent content, sexual content, offensive language, use of English and Kiswahili, and protection of children.

Suggested change: Conduct a thorough review of all articles and clauses that relate to each of the highlighted matters, and streamline them.

Concern: Inconsistency towards different types of broadcasters.

Suggested change: Review articles 4-7 to identify which terms need to apply solely to a particular type of broadcaster and should remain, and which should apply to all types of broadcasters and should therefore be incorporated into articles 10-18 or 31-41 as appropriate.

Concern: Inconsistency towards political parties’ sponsorship of content.

Suggested change: Remove 18(1)(c).

 

Onerous requirements for broadcasters, including local content requirements

The regulations introduce a range of requirements for broadcasters that would create a large workload and considerable cost if carried out in full.

 

Regulation 16(1) requires that, for any programmes that allow the expression of personal views, “the audience shall be informed in advance and be given an opportunity to respond to such views.” Does this mean that all public debate programmes must include a phone-in element? If so, the burden on broadcasters would be excessive.

 

Further, 21(3), in providing for access to content by the deaf or blind, requires broadcasters to ensure that all content is subtitled, audio-described and translated into sign language. The intention of ensuring people with disabilities are included is positive, but requiring this of all content would impose a huge financial burden on broadcasters.

 

Article 30 covers local content, requiring that “a minimum of 60% of all content” must be local content and that “not less than 80%” of music broadcast is Tanzanian music. Similarly, article 8(1)(b) requires subscription broadcasters to “provide 25% local content by subscription of total channels”. While again the intention here is positive, the reality is that these minimums will be very difficult or impossible for many broadcasters to achieve while retaining viewers / listeners and maintaining commercial viability.

 

Article 38 covers programme schedules, requiring that broadcasters must “publish [a] programme schedule in a daily newspaper … at least one month in advance”, “adhere to the programme schedule … unless otherwise obliged to broadcast spontaneous events of national or international significance”, “not [to] change [the] programme schedule without prior notification to [TCRA]” and submit to TCRA a “quarterly programme schedule fourteen days before each quarter.” These terms would severely limit the ability of broadcasters to adapt quickly to changing circumstances and new opportunities.

 

Finally, there are specific restrictions on non-commercial and community broadcasters that would limit their ability to raise funds to cover their operating costs. Specifically, 6(1)(d) prohibits non-commercial broadcasters from airing “advertisements that put them on the same status as commercial broadcasters. Allowable advertisements include announcements and underwritings”. Further, the definition of community broadcasters in article 3 specifies that they must be “not for profit”, and 7(1)(l) requires community broadcasters to “avoid advertising any product or services which are detrimental to the community’s well-being and health”.

 

 

Specific concerns and suggested changes:

Concern: Requirement for all opinion content to be accompanied by public phone-ins or similar.

Suggested change: Add “including submitting views by email or post.”

Concern: High burden on broadcasters in requiring all content to be accompanied by subtitles, audio description and sign-language translation.

Suggested change: Relax these terms so that they apply only to specified content (such as news, or peak-time content) or to a specified minimum amount of content (such as 10%).

Concern: High local content requirements.

Suggested change: Reduce the minimum local content requirements. Suggested minimums are 20% for “all content”, 40% for music and 10% of subscription broadcasters’ channels.

Concern: Restrictive rules against changing programme schedules.

Suggested change: Require broadcasters to publish schedules as per the proposed regulations, but allow freedom to change schedules at short notice, requiring only that broadcasters provide information to TCRA on any changes made within 14 days after broadcast.

Concern: Restrictions on advertising for non-commercial and community broadcasters.

Suggested change: Allow all broadcasters to take advertisements, requiring only that non-commercial broadcasters re-invest all income in content and operating costs.

 

Morality and related judgements

In several places, the regulations refer to public morality and related issues of judgement, usually without providing specific details. 10(1)(c) and (d), for example, require all broadcast content to “observe good taste and decency” and to “uphold public morality”. 16(6) requires every broadcaster to “refrain from broadcasting programme that are likely to promote prostitution and other immoral activities.”

This raises two concerns. First, who should be the moral arbiters who determine what is and is not acceptable? Second, the lack of specifics introduces uncertainty and the opportunity for misinterpretation and misuse. The phrase “other immoral activities”, for example, could be argued by some to include polygamy, while others might argue that this is acceptable behaviour. Corporal punishment might similarly divide people.

Specific concerns and suggested changes:

Concern: Vague terms governing content that could offend public morals.

Suggested change: If specific moral concerns are to be prohibited, they should be specified in full.

 

 
