Frequently Asked Questions (FAQs)
What is BULATS?
BULATS stands for "Business Language Testing Service." There are three tests: a Computer-Adaptive Test, which assesses listening, reading and language knowledge, and separate Writing and Speaking Tests, which assess written and oral language skills.
BULATS tests are available in four languages: English, French, German and Spanish.
What is a Computer-Adaptive test?
A computer-adaptive test is smart: it selects questions during the actual test based on how well you have answered the previous question—not on some preconceived order.
The next test question is selected only after the computer has graded your answer to the current one. Answer the question correctly and the computer will choose your next question at a slightly more difficult level. If, on the other hand, you give a wrong answer, the test will drop a level and keep dropping until you begin to answer questions correctly.
In other words, test questions become progressively easier or harder until the system has a reliable assessment of the candidate's level.
For example, if your English level is around the A1 level, after the first few questions you will essentially be taking an A1 test. On the other hand, if an English native speaker were to sit the test, the questions would rapidly get harder to test at the advanced “C2” level.
The BULATS Computer-Adaptive test doesn’t care how many questions a candidate gets right or wrong. Like a blood test or an eye test, it’s diagnostic in nature and simply focuses on the level of the individual candidate.
A Computer-Adaptive test can typically be shortened by 50% and still maintain a higher level of precision than a standard fixed version. Test-takers do not waste their time attempting items that are too hard or trivially easy. Additionally, the testing organization benefits from the time savings; the cost of examinee seat time is substantially reduced.
A typical BULATS Computer-Adaptive test is some 45 to 50 questions. As the test chooses from a bank of over 10,000 questions and listening tasks, each BULATS test is virtually unique.
The “adaptive” feature of the BULATS Computer-Adaptive test makes it the most accurate English language proficiency test available today.
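The adaptive loop described above can be sketched in a few lines of code. This is a deliberately simplified illustration: the level names, single-step adjustment, and fixed item count are assumptions for the sketch, not the actual BULATS algorithm, which draws on a calibrated 10,000-item bank.

```python
# Illustrative difficulty scale (CEFR-style labels); the real BULATS item
# bank and selection algorithm are proprietary — this is only a sketch.
LEVELS = ["A1", "A2", "B1", "B2", "C1", "C2"]

def run_adaptive_test(answer_correctly, num_items=10, start=2):
    """Simulate a simple adaptive test.

    answer_correctly(level) -> bool: whether the candidate answers an
    item at the given difficulty level correctly.
    Returns the level label the test settles on.
    """
    level = start
    for _ in range(num_items):
        if answer_correctly(LEVELS[level]):
            level = min(level + 1, len(LEVELS) - 1)  # harder item next
        else:
            level = max(level - 1, 0)                # easier item next
    return LEVELS[level]

# A hypothetical candidate who is solid up to B1 but fails anything harder:
candidate = lambda lvl: LEVELS.index(lvl) <= 2
print(run_adaptive_test(candidate))  # settles around "B1"
```

Note how the loop homes in on the level where correct and incorrect answers balance out — which is why the test measures a candidate's level rather than a raw count of right answers.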
How many questions are in the Computer-Adaptive test?
A typical BULATS Computer-Adaptive test is some 45 to 50 questions, yet the test chooses from a data bank of over 10,000 questions and listening tasks.
This ability to select questions from all levels—from beginner to advanced levels—means the BULATS CBT can be taken by anyone to determine their individual English language proficiency.
It also means that no two BULATS tests are identical.
How accurate is BULATS?
There are two key exam qualities: validity and reliability.
Validity relates to the usefulness of a test for a purpose: does it enable well-founded inferences about candidates' ability? Can performance in the test be interpreted in terms of ability to perform in the real world?
BULATS uses authentic workplace situations to test a candidate’s ability to use language in real business situations.
BULATS tests how well you can use language, not how much you know about a language.
Reliability relates to the accuracy of the measurement of the exam: does it rank-order candidates similarly in repeated uses? Can we expect a candidate to achieve the same score in two versions of the same test or in the Computer-based and the Standard tests?
The reliability of the overall test is 0.94, which is very high. Reliability is calculated using the Rasch model.
The BULATS Computer-based test is adaptive and is supported by a large bank of encrypted, secure tasks. The more than 10,000 questions and listening tasks mean that each BULATS Computer test is unique and different candidates will not be presented with the same items.
An algorithm chooses items as the test progresses according to how the candidate performed on previous items.
It allows candidates to face items at an appropriate level of difficulty and provides more accurate assessment of candidate ability than a non-adaptive test with a similar number of items.
What accents are used in the listening test?
A variety of native English accents is used, including British English and American English. Proficient non-native speakers are also occasionally used.
What languages are BULATS available in and how is the overall BULATS score calculated?
The BULATS tests are available in English, French, Spanish and German. The overall BULATS score is not simply an average of the two section scores: the program uses encrypted look-up tables to calculate the overall score, as the two sections carry different ability weightings. If the result in the Listening section is 50, for example, and the score for the Reading and Language Knowledge section is 60, it will not necessarily follow that the overall result will be 55.
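A small sketch makes the weighting point concrete. The weights below are invented purely for illustration — the real BULATS look-up tables are encrypted and proprietary — but they show why a 50 and a 60 need not average to 55.

```python
# Hypothetical section weights, invented for illustration only;
# the actual BULATS look-up tables are encrypted and not published.
WEIGHTS = {"listening": 0.45, "reading_language": 0.55}

def overall_score(listening, reading_language):
    """Combine two section scores (0-100) into a weighted overall score."""
    raw = (listening * WEIGHTS["listening"]
           + reading_language * WEIGHTS["reading_language"])
    return round(raw)

print(overall_score(50, 60))  # a weighted result, not the plain average of 55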
How useful is exam preparation for improving students’ language ability?
BULATS tests reading and listening skills that are required for most purposes – not only in business – so exam preparation is valuable even for someone not taking the exam. However, explanations and examples are included here for candidates who haven't had specific exam preparation.
How do BULATS tests link to other language tests?
The relationship between BULATS and other language tests
BULATS links to the Cambridge ESOL framework of language levels which is recognised around the world.
Cambridge ESOL is a member of ALTE - the Association of Language Testers in Europe - which has eighteen institutional members testing their own languages as a foreign language. Currently fifteen languages are represented.
As part of the Cambridge ESOL system, BULATS links to the ALTE framework of five proficiency levels and the Common European Framework Levels. These levels are based on the work of the Council of Europe and provide an international and multi-lingual basis for comparison of language proficiency in different languages. An attached document illustrates how other language examinations link to the same ALTE Framework.
How these links are established
- the ALTE Framework which establishes what represents each level in each language
- a common set of statements of ability which are being validated against tests for each language
- standardized test specifications across the different languages
For the Standard BULATS Tests in all languages, each test is linked to the ALTE Framework through the use of 'anchor tests' and statistical analysis of the results of trialling. These anchor tests are used to measure the difficulty of each question in relation to a fixed scale which has been established through extensive research on more than 1000 candidates, of a wide range of nationalities, at each level of proficiency.
For the Speaking and Writing Tests, levels are established by matching them to standards of performance indicated in an extensive databank of performance built up over many years of running tests of speaking and writing.
Why shouldn't companies just use the tests used by local language schools?
- Producing reliable and relevant language tests is a specialised skill; using tests produced by language schools/consultants, whose expertise is in training rather than in producing tests, will mean a lower level of quality - weaknesses such as questions with more than one answer or with no correct answer, questions which depend on world/cultural knowledge rather than language skills, questions which focus on trivial skills rather than key skills, questions where candidates can guess the answers without understanding the text, typing and linguistic errors, etc.
- Language schools/consultants are not in a position to do the research necessary to establish a fixed scale of language ability to underlie the test results - this means that there is no real basis for saying with any confidence what level of ability is indicated by particular scores.
- BULATS is based on a fixed scale of language ability and this can be related to leading language examinations in Europe.
- Language schools/consultants are not able to trial their tests as extensively as is done for BULATS. In most cases, language schools/consultants will not trial their test at all.
- Language schools/consultants are not able to measure the exact difficulty of each question in the test, and therefore they are unable to relate the difficulty of their examinations to the ALTE/Council of Europe Framework.
- BULATS can provide equivalent tests in English, French, German, and Spanish.
- BULATS is completely independent of the training provided or offered. Companies can have complete confidence that the results provided by BULATS are not influenced by any other interests - such as the need to demonstrate progress in existing courses or the need for further training courses, etc.
Since BULATS is divided into a Computer-Adaptive test, a Speaking test and Writing test, do employees have to take all three tests on the same day?
BULATS offers an organization the flexibility to assess their staff, trainees or applicants in whichever way they like. They can use just one test (e.g. the Computer-Adaptive Test) or they can use all three tests. They can make their staff take all three tests in one day or on three different days. The client organization is able to choose whatever strategy they think most useful - though of course the Agent will advise the client on what options are most likely to meet the client's requirements.
The exception is the Certificated BULATS service. Because this has been designed as a secure test session to meet the requirements of a third party such as the United Kingdom Border Agency, all three tests are taken in one session.
In the listening test, is it necessary for candidates to understand every word?
No, they just need to pick out the information which is being tested.
Several candidates scored zero in the Listening section of the test. Their scores in the Reading and Language Knowledge test were much higher. How is this possible?
It is rare for candidates to score zero in the Listening test, because even by guessing a candidate will usually answer at least a few questions correctly. The zero these candidates received was due to either skipping questions or not completing the test within the allotted time of 75 minutes (if set). Because CB BULATS is an adaptive test, if candidates skip questions or do not answer enough questions, the test simply cannot calculate their ability in that skill area.
To avoid this happening you should:
- ensure that all candidates know that they have to answer all the questions put to them during the test.
- ensure that they finish the test within the time limit (if one has been set in Supervisor mode).
Should I set a time limit for the Computer-Adaptive Test?
Each test varies in length due to the adaptive nature of the test, but should ideally be completed within 75 minutes and should not take less than 40 minutes.
Why is it important for scores to be reported on a European standard?
More and more companies are working in partnership with companies from other countries - whether this is the result of mergers, takeovers, joint ventures, closer supplier/distributor chain relationships, or other types of relationship. Communication is essential for effective and efficient working practices, and it is vital that the same standards of language skill are used across these international partnerships. What companies need these days is a single system of describing different levels of language ability. And this single system should not just apply for one language, but for a whole range of key languages.
Most companies do not have the time or expertise to establish a framework of standards in language ability within their own company, and they would certainly find it very difficult to link these to standards used by other companies they work with. What is needed is a properly researched, relevant and internationally accepted framework of levels which any company in the world can relate to, whatever their particular needs.
ALTE (the Association of Language Testers in Europe) has been working for many years on establishing such a framework for all European languages. There is no system in the world with the same depth and breadth to its framework. For example, with the ALTE Framework, a multinational company in France can apply the same standards of language skill for workers coming to the French headquarters and who need French, for their staff who work in international business and need English, and for their staff who are frequently communicating with their Argentinian suppliers and so need Spanish. The ALTE framework has now been linked to Council of Europe levels which have a wide currency both in Europe and further afield.
BULATS is the only system in the world that can provide this single system in such a reliable and practical way.
What sorts of questions and listening tasks are used in the test?
All materials are adapted from authentic business materials, such as articles from business magazines, company literature, business correspondence, presentations, discussions, etc. The recordings for the Listening Test are scripted and nearly all have one or two speakers.
How reliable are the test results?
An estimate of the test's reliability gives an indication of how far a difference in score on a test is significant and not just the result of chance. The most common estimate of reliability for this type of test is the alpha coefficient, a figure between 0 and 1, where a value close to 1 indicates high reliability. BULATS tests have alpha coefficients over 0.9, which conforms to accepted international standards of reliability for tests of this kind.
Margin of error
For BULATS, one can be confident that a Band 3 is clearly distinct from a Band 5. With contiguous levels, e.g. 3 and 4, there will be an element of uncertainty around the cut score. The standard error of measurement (SEM) for a BULATS test is about 4 points on the scale for the overall test result (0-100). We would therefore recommend that, when clients are making important decisions on the basis of these scores alone, they should allow a margin of 5 points on either side of any benchmark levels for the final overall score.
For greater certainty, more evidence of ability is required and clients are encouraged to use the productive skill modules. These are particularly suitable for discriminating at the top end of the scale (level 3-5). These modules provide an indication of strengths and weaknesses in different skills in addition to a band score.
Section scores are less precise than the overall score. For each section score, we again recommend a margin of 5 points on either side of any benchmark level; on a section, that same 5-point margin represents a wider margin of uncertainty than it does on the overall score.
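The recommended margin can be applied mechanically when screening scores against a benchmark. The function below is an illustrative sketch of that decision rule (the function name and classification labels are assumptions, not part of BULATS):

```python
def compare_to_benchmark(score, benchmark, margin=5):
    """Classify a BULATS overall score (0-100) against a benchmark,
    allowing the recommended +/-5-point margin of uncertainty
    (the SEM is about 4 points on the overall scale)."""
    if score >= benchmark + margin:
        return "clearly above"
    if score <= benchmark - margin:
        return "clearly below"
    return "borderline - gather more evidence (e.g. Speaking/Writing modules)"

print(compare_to_benchmark(62, 55))  # clearly above
print(compare_to_benchmark(57, 55))  # borderline - within the margin
```

Scores falling in the borderline band are exactly the cases where the Speaking and Writing modules add the most value.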
What is a 'good result'?
BULATS gives information on what level of ability each candidate has in the language tested. It does not in itself say what is a 'good' level. Candidates are placed in a framework of five levels, and descriptions of what each level means in practical terms - the 'Can-Do' Statements - are provided to clients.
How can the 'Can-Dos' be linked to test results? How can it be proved that someone at BULATS level 3 can typically carry out the same job tasks as someone with First Certificate in English (FCE), etc?
The 'Can-Do' statements were originally produced by analyzing the content of the Main Suite examinations and deducing which real-life skills would be compatible with the skills tested at each level.
As the result of extensive research, Cambridge ESOL has produced a Common Scale of Language Ability. This is a fixed scale - a yardstick - on which all measures of language ability can be placed.
Underlying this is the assumption that, despite variation in candidates' skills in particular areas, we can sum up their overall ability as a single score. Two candidates may have the same overall ability although one is stronger in speaking skills and the other is better at understanding reading texts.
It is then possible to look at the correlation between the scores candidates get on particular types of test and their overall score. From this analysis we find that certain types of test correlate highly with overall ability. Where possible, it is better to test all relevant skills; but where that is not feasible, we can make use of those types of test which correlate most highly with overall ability.
How many hours of teaching is necessary to go up one ALTE level?
It is impossible to give a clear answer to this because it depends on many variables:
- the intensity of the training (100 hours continuous training may be less effective than 100 hours part-time)
- the opportunity for practice outside the classroom training (especially whether the trainees are in a country or company where the target language is spoken, but also issues such as time spent in self-access centres, quantity of homework, etc.)
- the quality of the training
- the motivation and aptitude of the learner
- the starting level - beginners progress more quickly than advanced learners
As a rough guide, the Council of Europe suggests that the average learner should move up a level with 180-200 learning hours including independent work. A full-time employee on a training course with three hours of tuition and one or two hours of 'homework' each week would probably take about one year to move up an ALTE level.
What level of English is required to understand the test?
The Computer-Adaptive Test contains a range of items at all levels; in general, the more difficult Reading and Language Knowledge questions appear in Part 2 of the test. The test selects questions based on the previous answers a candidate gives. There is no ‘pass mark’: candidates are placed in one of six levels based on their score.
What level does an executive need to have in order to work with a native speaker, particularly for speaking?
As a general guideline, and assuming that the executive needs to operate independently in a typical range of managerial tasks, we would advise that the learner should be level 3 or better, preferably with a minimum of level 4. This would almost certainly apply to speaking skills as well, though it may not apply to writing skills in all cases, e.g. where the manager only needs to write short, note-like emails.
However, a more accurate answer would depend on the tasks involved and the degree of independence and responsibility the executive has in those tasks. Some examples of the type of task which we distinguish from the point of view of language level are:
- requesting work-related services
- providing work-related services
- participating in meetings and seminars
- following a demonstration
- giving a presentation
- understanding correspondence
- writing faxes, letters, etc.
- understanding reports, journals, etc.
- understanding notices and instructions
- taking phone messages
- making outgoing phone calls
- making travel and hotel arrangements
With more precise information about the tasks and responsibilities of the executive, it is possible to give a more precise and useful answer.
Do candidates have to wait for a certain time before re-sitting the test? If so, why?
There is no official minimum time lapse requirement before a candidate resits a BULATS test. However, we advise clients not to put candidates through the test again until they have been through at least 100 hours of training over a period of more than 3 months. If a candidate resits the test after a short period, they may get the same score or may even get a lower score.
Occasionally, candidates request to re-sit the test because they felt they underachieved significantly the first time round - because of illness, misunderstanding what was required of them, or some other reason. This is a different situation, since the purpose of the second test is not to pick up any improvements over a short period. Therefore we would not particularly advise against this - though the client may, of course, not want to allow candidates this option.
Is BULATS internationally recognized?
More than 13,000 organizations worldwide trust Cambridge ESOL, and 3 million candidates sit Cambridge ESOL exams every year in 130 countries.
Employers around the world recognize BULATS (Business Language Testing Service) as proof of their staff's or new candidates' ability to communicate effectively in English at work.
Corporations from diverse industries have employed BULATS, either for new staff recruitment or as part of their training regimen for ongoing staff development programs.
To learn more about who's who among the world's "blue chip" corporations—organizations that have made BULATS an integral part of their human resource programs—download our BULATS Global Recognition brochure.
How do I decide whether to take the Speaking, Writing or Computer-Adaptive BULATS test or some combination of them?
A key feature of BULATS compared to other tests of English is that it offers you flexibility. Ideally, it is recommended to use a combination of the BULATS Computer-Adaptive test with the Speaking and/or Writing Test to offer a more complete picture of the four language skills that make up a person’s English competence.
In making your decision it is important that you consider what you want to use the test for. For example, some organisations use the Computer-Adaptive test at an early stage in the recruitment of staff to check on a minimum English competence of a large number of applicants. They may then use the Speaking and/or Writing Tests in the final selection from a few candidates. If a post requires a high level of writing competence for emails or reports then it makes sense to use the Writing Test. Similarly, if a person needs to speak English to deal with customer enquiries or in working with English speaking staff then use the Speaking Test.
BULATS has been carefully designed to offer a premium quality service with the reliability of a double examiner marking system, flexibility in its test dates and venue, and a fast turnaround of results. The costs involved in maintaining this service point to BULATS inevitably being priced at the top end of the market. A key question is the value an organisation places on raising the standard of English of its staff. BULATS offers a quality option.
How do BULATS scores compare to those of other international English tests?
BULATS focuses specifically on modern workplace English and on the effectiveness of a person’s ability to communicate in a real situation. Many other tests do not. Trying to compare different English test scores is therefore like trying to compare apples with oranges. However, underpinning the design of BULATS is the carefully researched Association of Language Testers in Europe (ALTE) set of linguistic criteria.
A copy of this alignment chart can be downloaded at the bottom of this page.
Who are the examiners for the speaking and writing tests?
Vantage Siam’s Speaking and Writing examiners are carefully selected, trained and finally, Cambridge ESOL-certified.
Examiners must have native speaker English competence, a recognised international qualification as a teacher of English as a foreign/second language, and at least 3 years relevant, practical teaching experience. Vantage’s examiners also hold a college degree to at least the bachelor’s level and many hold post graduate qualifications.
All of Vantage's examiners have successfully completed an intense training program which includes submission of standardised scores to Cambridge ESOL in the UK for verification.
In Thailand, Who Uses BULATS?
Proof of Proficiency
Cambridge ESOL certificates are trusted by over 12,000 organizations globally as demonstrable proof of the holder’s ability to communicate effectively in English at work.
In Thailand, BULATS is a qualified exam for the United Kingdom’s Border Agency for language proficiency certification for tier-based and settlement visas.
Many of Thailand’s blue-chip corporations employ BULATS as a key management benchmark for staff recruitment and as an innovative tool of their Human Resource training and staff development programs. Innovative Thai companies are increasingly looking at markets beyond Siam and want to position their products in a global marketplace.
Some of these successful companies include:
DST WorldWide Services
Bayer Materials Science
American International Assurance Ltd
Advance Agro Co. Ltd
Boyden Associates (Thailand) Ltd.
British American Tobacco (Thailand)
TN Information Systems Ltd
New Hampshire Insurance Company
Property Care Services (PCS)