
6. What is typically used in modelling?

  • Cortical columns
  • 3-layer architecture
  • Mono layers
  • Dual layers

7. What is the core premise of neural computation?

  • Neurons create chains of activation seen in the mappings and organisation of the cortex
  • A specific pattern of activity in a single neuron induces a specific pattern of activation in another
  • A specific pattern of activity in one population of neurons induces a specific pattern of activation in another
  • Neurons represent information much in the same way as bytes in a computer. Modelling and computations as a simulation of thought

8. What is meant by learning?

  • Finding a set of connection weights to make the network useful by making systematic changes to the weights in response to input patterns
  • A concept analogous to the total strength of all inhibitory and all excitatory connections between the neurons
  • Characterised by 'neurons that fire together wire together'
  • Neural networks can be connected to transform ANY set of input patterns into ANY set of output patterns

9. Can deep learning approach and exceed human knowledge on tasks such as face and object recognition?

  • Yes
  • No

10. What would happen if there were no competition rules?

  • The same units would win again and again and become stronger but would not be selective for different inputs. Other units would never win
  • The network would break with a large enough input
  • The network would not be able to cope with variable stimuli as the input nodes would be oversaturated
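The competition rule this question refers to (winner-take-all, enforced by lateral inhibition) can be sketched in a few lines; without it, the strongest unit wins every pattern and never becomes selective for different inputs, as the first option describes. The learning rate, patterns, and initial weights below are illustrative assumptions, not values from the quiz:

```python
# Winner-take-all sketch: only the most active output unit "wins"
# each input pattern and has its weights moved toward that input.
eta = 0.5                            # learning rate (assumed)
patterns = [[1.0, 0.0], [0.0, 1.0]]  # two distinct input patterns (assumed)
units = [[0.6, 0.4], [0.4, 0.6]]     # weight vectors of two output units

for _ in range(10):
    for p in patterns:
        # activation = sum of input activation x weight, for each unit
        acts = [sum(a * w for a, w in zip(p, u)) for u in units]
        winner = acts.index(max(acts))  # competition: a single winner
        # only the winner's weights move toward the input pattern
        units[winner] = [w + eta * (a - w) for a, w in zip(p, units[winner])]

# each unit has become selective for a different pattern
print([[round(w, 2) for w in u] for u in units])  # [[1.0, 0.0], [0.0, 1.0]]
```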

11. What are the three features of Selfridge's (1959) pandemonium model of letter recognition?

  • Decision 'demon' -> feature 'demon' -> cognitive 'demon'
  • Feature 'demon' -> cognitive 'demon' -> decision 'demon'
  • Decision 'demon' -> cognitive 'demon' -> feature 'demon'
  • Cognitive 'demon' -> feature 'demon' -> decision 'demon'

12. What is Hebbian learning, which underlies the core idea of weight changes?

  • Characterised by 'neurons that fire together wire together'
  • Finding a set of connection weights to make the network useful by making systematic changes to the weights in response to input patterns
  • A concept analogous to the total strength of all inhibitory and all excitatory connections between the neurons
  • Neural networks can be connected to transform ANY set of input patterns into ANY set of output patterns
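The Hebbian rule ('neurons that fire together wire together') amounts to a weight change proportional to the product of pre- and postsynaptic activity. A minimal sketch, where the learning rate and activations are assumed toy values:

```python
# Hebbian update: delta_w = eta * pre * post
# The weight grows only when pre- and postsynaptic
# activity co-occur ("fire together, wire together").
eta = 0.1             # learning rate (assumed value)
w = 0.0               # initial connection weight
pre, post = 1.0, 1.0  # both neurons active together

for _ in range(5):
    w += eta * pre * post  # weight strengthens on each co-activation

print(round(w, 2))  # after 5 co-activations: 0.5
```

Note that if either `pre` or `post` were 0, the product would be 0 and the weight would not change: connections only strengthen between co-active neurons.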

13. What variant of learning does deep learning use?

  • Supervised learning
  • Unsupervised learning
  • Competitive learning

14. Do neural networks start with initially random weights?

  • Yes
  • No

15. Which of these characterises unsupervised learning?

  • Network learns to identify clusters/statistical structures in the input. Output neurons compete to respond to each input pattern (lateral inhibition)
  • Network learns by discovering the inherent structure in the input. Based heavily on Hebbian learning
  • Network learns by being presented with example input patterns and the desired output pattern. Initial output is different but is gradually and systematically changed to reduce discrepancy.

16. In supervised learning, what happens after each input pattern?

  • Network checks the output against the underlying structure of the input to attempt to reduce discrepancy in the whole model
  • Connections are strengthened so as to reduce contributions to discrepancy between observed and desired output
  • Connections are weakened/strengthened so as to reduce contributions to discrepancy between observed and desired output
  • Connections are weakened so as to reduce contributions to discrepancy between observed and desired output
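The weaken/strengthen step after each input pattern can be sketched as a delta-rule update: each weight moves in proportion to its input's contribution to the error between observed and desired output. The learning rate, input pattern, and target below are illustrative assumptions:

```python
# Delta-rule sketch: weights are strengthened or weakened so as to
# reduce the discrepancy between observed and desired output.
eta = 0.5                    # learning rate (assumed)
inputs = [1.0, 0.0, 1.0]     # one example input pattern (assumed)
weights = [0.2, 0.2, 0.2]    # small initial weights
target = 1.0                 # desired output for this pattern

for _ in range(20):
    observed = sum(a * w for a, w in zip(inputs, weights))
    error = target - observed  # discrepancy for this pattern
    # each weight changes in proportion to its input's contribution
    weights = [w + eta * error * a for a, w in zip(inputs, weights)]

observed = sum(a * w for a, w in zip(inputs, weights))
print(round(observed, 3))  # converges to the target: 1.0
```

Weights on active inputs are strengthened when the output is too low and weakened when it is too high; weights on inactive inputs (here the second one) do not change, since they contributed nothing to the discrepancy.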

17. What is the concept of generality?

  • Neural networks can be connected to transform ANY set of input patterns into ANY set of output patterns
  • A concept analogous to the total strength of all inhibitory and all excitatory connections between the neurons
  • Finding a set of connection weights to make the network useful by making systematic changes to the weights in response to input patterns
  • Characterised by 'neurons that fire together wire together'

18. What are the two criteria for generality?

  • Each input pattern maps onto ANY output pattern and there are enough units (4) and layers of units (unlimited)
  • Each input pattern maps onto ANY output pattern and there are enough units (unlimited) and layers of units (4)
  • Each input pattern maps onto only ONE output pattern and there are enough units (unlimited) and layers of units (4)
  • Each input pattern maps onto only ONE output pattern and there are enough units (4) and layers of units (unlimited)

19. Which is correct?

  • Sum of (activation of input neuron × output)
  • Sum of (activation of input neuron − output)
  • Sum of (activation of input neuron × weight of input)
  • Sum of (activation of input neuron − weight of input)
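The correct expression, the sum over inputs of activation × weight, can be computed directly. The toy activations and weights below are assumptions for illustration:

```python
# Net input to a unit = sum of (activation of input neuron x weight of input)
activations = [1.0, 0.5, 0.0]  # input neuron activations (assumed)
weights = [0.8, -0.4, 0.6]     # connection weights (assumed)

net_input = sum(a * w for a, w in zip(activations, weights))
print(round(net_input, 2))  # 1.0*0.8 + 0.5*(-0.4) + 0.0*0.6 = 0.6
```

Negative weights model inhibitory connections and positive weights excitatory ones, which is why both kinds contribute to the sum.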

20. Which of these characterises supervised learning?

  • Network learns to identify clusters/statistical structures in the input. Output neurons compete to respond to each input pattern (lateral inhibition)
  • Network learns by being presented with example input patterns and the desired output pattern. Initial output is different but is gradually and systematically changed to reduce discrepancy.
  • Network learns by discovering the inherent structure in the input. Based heavily on hebbian learning